Month: July 2024
Blank Report Notes View
Hello,
How can I view the entirety of a note on the blank report?
The far-right column is only showing the first line of any notes against tasks.
Thank you
What is the QuickBooks Connection Diagnostic Tool?
The QuickBooks Connection Diagnostic Tool is a utility provided by Intuit that helps diagnose and resolve network connectivity issues, database errors, and multi-user mode problems in QuickBooks Desktop. This tool is designed to identify and fix issues that might occur when trying to connect to the QuickBooks company file. For more information, you can contact our QuickBooks ProAdvisor.
Group expiration policy – what criteria does the policy use to determine when a group expires?
We are considering enabling group expiration on 80k+ groups. Before we do so we would like to understand which groups will be affected. There isn’t any documentation that I could find on which criteria the group expiration policy uses, and when we opened a case with Microsoft they weren’t able to tell us much other than ‘check audit logs’, which doesn’t help since that only keeps data for 90 days.
Does anyone have any insight on this one? Maybe a way to run a ‘what-if’ scenario before we kick off the policy? Thanks!
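As a rough what-if, and assuming the policy mechanics are driven by each group's renewal timestamp plus the policy's configured lifetime (the `renewedDateTime` property below comes from Microsoft Graph group objects; the rest is an illustrative sketch, not documented behavior), one could pre-compute which groups would be due:

```python
from datetime import datetime, timedelta, timezone

def groups_due_to_expire(groups, lifetime_days, now=None):
    """Return names of groups whose renewedDateTime + lifetime has passed.

    `groups` is a list of dicts shaped like Graph group objects with a
    'renewedDateTime' ISO-8601 string (an assumption for this sketch).
    """
    now = now or datetime.now(timezone.utc)
    due = []
    for g in groups:
        renewed = datetime.fromisoformat(g["renewedDateTime"].replace("Z", "+00:00"))
        if renewed + timedelta(days=lifetime_days) <= now:
            due.append(g["displayName"])
    return due

# Made-up sample data for illustration
groups = [
    {"displayName": "Finance", "renewedDateTime": "2023-01-01T00:00:00Z"},
    {"displayName": "NewHires", "renewedDateTime": "2024-06-01T00:00:00Z"},
]
now = datetime(2024, 7, 1, tzinfo=timezone.utc)
print(groups_due_to_expire(groups, 180, now))  # only 'Finance' is past 180 days
```

Exporting all groups with their renewal dates via Graph and running them through a check like this would at least bound the blast radius before enabling the policy.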
On-Premises Intelligent Recap Needed Due to National Regulation
Hello
I am currently working as an architect for a government. For legal reasons, it is not possible for us to use the Meeting intelligent recap functions in Microsoft Teams Premium. The government in question does not want the voice content of meetings to be analyzed by AI bodies located outside its territory.
As a result, we’re looking for a way to enable voice content to be captured on premise during meetings, to be analyzed locally and to provide an intelligent recap service on premise.
Naturally, we’ve contacted a number of vendors, most of whom are very cloud-oriented.
To date, we don’t know whether we would be able to capture meeting audio streams via bot development, for example.
Have you had to deal with this kind of problem? Or do you know of any third-party solutions that address this type of issue?
Thank you in advance.
Sincerely
Datedif with hired date term and rehire
=(DATEDIF(E26,F26,"Y")+DATEDIF(K26,$L$2,"Y"))&" Y "&(DATEDIF(E26,F26,"YM")+DATEDIF(K26,$L$2,"YM"))&" M"
This is giving me months greater than 12. How do I correct this?
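The months exceed 12 because the two "YM" remainders are added without carrying over into years. One fix is to total all the months first and then split with integer division and remainder; the arithmetic, sketched in Python with hypothetical DATEDIF part values:

```python
# Hypothetical DATEDIF parts for the two service periods (years, leftover months)
y1, m1 = 3, 7   # first stint:   DATEDIF(E26, F26, "Y") and "YM"
y2, m2 = 2, 8   # after rehire:  DATEDIF(K26, $L$2, "Y") and "YM"

# Convert both periods to months, sum, then carry overflow months into years
total_months = (y1 * 12 + m1) + (y2 * 12 + m2)
years, months = divmod(total_months, 12)
print(f"{years} Y {months} M")  # 6 Y 3 M (a naive sum would show 5 Y 15 M)
```

In Excel terms: sum the total months as `DATEDIF(...,"Y")*12+DATEDIF(...,"YM")` for each period, then display `INT(total/12)&" Y "&MOD(total,12)&" M"`.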
How to set repeating times?
I’ve got a Bookings calendar which is linked to multiple colleagues’ calendars to allow bookings to be made in blocks of 90 minutes (10am, 11:30, 13:00, 14:30, 16:00). These 5 times are the slots we would like to show.
However, with some staff, if they have other meetings/appointments in their calendar, it shows the only other available 90 minutes they have free (e.g. 13:55).
Is there a way to only allow the times we want, and if they are busy at those times display them as not available instead of a random time during the day?
Thanks
Enter different price depending on rate and job number
Hi, good day all,
I am trying to create a quick search for invoices where I have to select between the Day or Night rate.
From there I need the price to appear (from the database), and the price needs to be displayed based on the job number inserted in another cell.
E.g., the price list is from another sheet consisting of two different rates (day & night).
1. I want to select Day Rate.
2. I insert the Job no (106030)
3. The price will show (Day Rate for job 106030) pulling the data from the aforementioned price sheet.
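The steps above amount to a two-key lookup (job number + selected rate). A sketch of the logic in Python, with made-up job numbers and prices:

```python
# Hypothetical price sheet: job number -> price per rate
price_sheet = {
    106030: {"Day": 150.0, "Night": 180.0},
    106031: {"Day": 120.0, "Night": 140.0},
}

def lookup_price(job_no, rate):
    """Return the price for a job at the selected rate, or None if not found."""
    return price_sheet.get(job_no, {}).get(rate)

print(lookup_price(106030, "Day"))  # 150.0
```

In Excel, the same two-key lookup is typically an XLOOKUP or INDEX/MATCH on the job-number column, with the Day/Night selection choosing which price column to return.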
Best Regards,
Infra as Code vs VMSS
Guys, I always wondered… what is the difference between VMSS and creating VMs via IaC?
Which one is better?
Does one have more advantages than the other?
Unlock Unprecedented Value with Azure AI Document Intelligence Custom Extraction, now 40% Less!
Revolutionizing Document Processing for Your Business
At Azure AI, we continuously strive to empower businesses with cutting-edge technology at the most competitive prices. We are thrilled to announce that as of June 1st, the pricing of our custom extraction models, a key capability of Azure AI Document Intelligence, has been significantly reduced. With a 40% price drop, we are making it easier and more affordable for you to leverage the full potential of AI-driven document processing.
Custom Extraction
With Azure AI Document Intelligence, you can use pre-built models or build your own custom models to analyze and extract key information and insights from your documents, such as names, dates, amounts, addresses, signatures, and more. Custom Extraction allows you to build your own custom models to extract defined schema from your documents. With the Custom Extraction model, you can use the Document Intelligence Studio or the REST API to define the fields and label the values that you want to extract from your documents and train your model starting with just one sample document. Once the model is trained, you can use it to analyze new documents and get the extracted data in JSON format or use Document Intelligence SDK to get the analyzed result. Custom Extraction is ideal for scenarios where you have documents that are not supported by the pre-built models, or where you have specific business requirements that need custom logic or validation.
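As a rough illustration of consuming the extracted JSON on the client side (the field names and shape below are hypothetical, not the service's exact schema), a sketch:

```python
import json

# Hypothetical analyze result from a trained custom extraction model
result_json = """
{
  "documents": [
    {
      "fields": {
        "InvoiceDate": {"value": "2024-06-01", "confidence": 0.97},
        "Total":       {"value": "1250.00",    "confidence": 0.88}
      }
    }
  ]
}
"""

result = json.loads(result_json)
# Keep only fields whose confidence clears a review threshold;
# low-confidence fields would be routed to human review instead
extracted = {
    name: f["value"]
    for name, f in result["documents"][0]["fields"].items()
    if f["confidence"] >= 0.9
}
print(extracted)  # {'InvoiceDate': '2024-06-01'}
```

Gating on the per-field confidence score like this is how the confidence scores mentioned above typically feed into automation workflows.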
Discover New Pricing
We are excited to announce that we have reduced the price of the Custom Extraction model by 40%, from $50/1,000 pages to $30/1,000 pages. The new pricing is effective from June 1st, 2024, and applies to all regions where the Custom Extraction model is available.
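Concretely, the arithmetic at the new rate works out as follows (the page volume is illustrative):

```python
OLD_RATE_PER_1000 = 50  # $ per 1,000 pages before June 1st, 2024
NEW_RATE_PER_1000 = 30  # $ per 1,000 pages after the reduction

pages = 100_000
old_cost = pages // 1000 * OLD_RATE_PER_1000            # $5,000
new_cost = pages // 1000 * NEW_RATE_PER_1000            # $3,000
pct_saved = 100 * (old_cost - new_cost) // old_cost     # 40%
print(old_cost, new_cost, pct_saved)  # 5000 3000 40
```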
Commitment Tiers for Even Greater Savings – In addition to the price reduction, we are also offering lower prices for commitment tiers. By pre-committing to a specific capacity, you can enjoy even greater savings. This flexible pricing structure is designed to accommodate your business needs, providing you with cost-effective solutions that scale with your operations.
We invite you to explore the new pricing details and see how much you can save with Azure AI Document Intelligence. For more information, please visit our pricing page.
Maximize Efficiency and Expand Possibilities
This price reduction means you can now process more documents than ever before without breaking the bank. Whether you need to extract data from forms, financial records, or any other document type, our custom extraction models are designed to deliver precise and reliable results with confidence scores. The substantial savings allow you to allocate your budget more effectively, ensuring you get the most out of your investment in AI technology. Tasks that were previously considered too costly can now be efficiently handled with our custom extraction models. This opens up a world of possibilities for businesses of all sizes. From small enterprises looking to automate routine tasks to large corporations aiming to optimize complex workflows, everyone can benefit from the enhanced affordability of Azure AI Document Intelligence.
Why Choose Azure AI Document Intelligence?
Accuracy and Reliability: Our AI models are built to deliver accurate data extraction, reducing errors and improving efficiency.
Scalability: Easily scale your document processing capabilities to meet the growing demands of your business.
Customizability: Tailor our extraction models to your specific requirements, ensuring the perfect fit for your unique workflows.
Confidence scores: Leverage per-field confidence scores to maximize efficiency and minimize costs in automation workflows.
Cost Efficiency: With our new pricing, enjoy the best-in-class AI technology at a fraction of the cost.
Get Started!
If you are new to Azure AI Document Intelligence, you can get started by signing up for a free Azure account and getting $200 credit to explore any Azure service for 30 days. You can also use the free tier of Azure AI Document Intelligence, which offers 500 pages per month for the Custom Extraction model, and 2,000 pages per month for the pre-built models. To learn how to use the Custom Extraction model, you can follow the instructions on QuickStart Guide and refer to the sample code for the Custom Extraction. You can now extract data from your documents with less processing cost and accelerate your digital transformation with Azure AI. Start building your custom models today!
Read more
Custom document models – Document Intelligence – Azure AI services | Microsoft Learn
Azure AI Document Intelligence overview
Azure AI Document Intelligence sample repo
Microsoft Tech Community – Latest Blogs
The Rising Significance of APIs – Azure API Management & API Center
As we venture deeper into the digital era, APIs (Application Programming Interfaces) have become the cornerstone of modern software development and digital communication. APIs continue to be pivotal, acting as the conduits through which different systems, applications, and devices interact and exchange data. This growing reliance on APIs is reflected in investment trends, with a significant 92% of global respondents indicating that investments in APIs will either remain steady or increase over the next year (2023 State of the API Report, Postman, 2023). Recognizing the critical role that APIs play in modern software architecture, Microsoft has been consistently investing in and expanding its API suite to fulfill diverse needs. Managing APIs is not a one-size-fits-all solution: different API ecosystems require different API management approaches and tools. Azure API Center, a new Azure API service, recently reached General Availability (GA). Azure API Center is engineered to function independently, yet it seamlessly integrates with Azure API Management, giving customers options to manage various aspects of their API ecosystem.
Azure API Management: Your Gateway to Digital Transformation
Azure API Management (APIM) is a managed cloud service designed to streamline and secure the use of APIs. API Management acts as a secure front door to facilitate, manage, and analyze the interactions between an organization’s APIs and their users with some of their core functionalities listed below:
API gateway – operational management: API Management acts as a gateway, managing API exposure, security, and analytics during runtime.
Traffic routing: Acts as a facade to backend services by accepting API calls and routing them to appropriate backends.
API access: Verifies API keys and other credentials such as JWT tokens and certificates presented with requests to access APIs published through an API Management instance.
Operational stability: API Management allows you to enforce usage quotas and rate limits to manage the flow of requests to your APIs effectively to prevent API overuse. It also validates requests and responses in compliance with the specification – e.g., JSON and XML validation, validation of headers and query params.
Request transformation: The rich policy engine of API Management allows you to modify incoming and outgoing requests to your needs with more than 60 built-in policies and the option to build your own custom policies.
API logging: API Management provides the capability to emit logs, metrics, and traces, which are essential for monitoring, reporting, and troubleshooting your APIs.
Self-hosted gateway: API Management also offers self-hosted gateway capabilities, a containerized version of the default managed gateway to place your gateways in the same environments where you host your APIs.
Developer Portal: API Management features a developer portal, which can be generated automatically and is a fully customizable website with the documentation of your APIs. It facilitates API discovery, testing, and consumption by internal and external developers.
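The usage-quota and rate-limit behavior listed under "Operational stability" can be pictured with a minimal token-bucket sketch (a conceptual illustration only, not APIM's actual implementation; real APIM limits are configured declaratively as gateway policies):

```python
import time

class TokenBucket:
    """Allow up to `calls` requests per `renewal_period` seconds."""

    def __init__(self, calls, renewal_period):
        self.capacity = calls
        self.tokens = calls
        self.renewal_period = renewal_period
        self.last_refill = time.monotonic()

    def allow(self):
        now = time.monotonic()
        # Refill tokens proportionally to the time elapsed since the last call
        elapsed = now - self.last_refill
        self.tokens = min(self.capacity,
                          self.tokens + elapsed * self.capacity / self.renewal_period)
        self.last_refill = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False  # a gateway would answer HTTP 429 Too Many Requests here

bucket = TokenBucket(calls=3, renewal_period=60)
print([bucket.allow() for _ in range(5)])  # [True, True, True, False, False]
```

The same shape underlies quota enforcement: the gateway admits requests while budget remains and rejects the overflow, protecting backends from overuse.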
Azure API Center: Your Inventory for API Lifecycle Management
API inventory management: API Center allows you to register all your organization’s APIs in a centralized inventory, regardless of their type, lifecycle stage, or deployment location, for better tracking and accessibility.
Tackling API sprawl: APIs’ runtimes might be managed in multiple different API Management services or in multiple different API gateways from different vendors, or might not be managed at all. Azure API Center allows you to develop and maintain a structured API inventory.
Holistic API view: While API Management excels in runtime API mediation, its inventory management capabilities are limited to the types of APIs that are supported at runtime and to the versions that are actively managed in runtime. Azure API Center supports any kind of API types, such as AsyncAPIs, and you can easily track APIs across different deployment environments.
Real-world API representation: You can add detailed information about each API, including versions, definitions, custom metadata, and associate them with deployment environments (e.g. Dev, Test, Production).
API governance: Azure API Center provides tools to organize and filter APIs using metadata and to set up linting and analysis that check API design consistency for better conformance with API style guidelines. Additionally, it shifts API compliance left to API teams so that developers can create compliant APIs more productively and efficiently.
API discovery and reuse: It enables internal developers and API program managers to discover APIs through the Azure portal, an API Center portal, and developer tools, including a Visual Studio Code extension.
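The inventory-with-metadata idea above can be sketched as a small data model (the field names and sample entries are illustrative, not API Center's actual schema):

```python
from dataclasses import dataclass

@dataclass
class ApiEntry:
    name: str
    kind: str          # e.g. "REST", "AsyncAPI", "GraphQL"
    environment: str   # e.g. "Dev", "Test", "Production"
    owner: str

# A toy centralized inventory spanning types and environments
inventory = [
    ApiEntry("orders", "REST", "Production", "commerce-team"),
    ApiEntry("telemetry", "AsyncAPI", "Production", "iot-team"),
    ApiEntry("orders", "REST", "Dev", "commerce-team"),
]

def find(entries, **criteria):
    """Filter the inventory on any combination of metadata fields."""
    return [a for a in entries
            if all(getattr(a, k) == v for k, v in criteria.items())]

print([a.name for a in find(inventory, environment="Production")])
# ['orders', 'telemetry']
```

The point of the sketch: once every API (regardless of gateway or lifecycle stage) is registered with consistent metadata, discovery and governance become simple queries over one inventory.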
Navigating Your API Ecosystem – Example Scenarios for Azure API Center and API Management
Azure API Management is geared towards runtime API governance and observability, focusing on the operational aspects of API management, such as securing, publishing, and analyzing APIs in use.
Azure API Center, in contrast, is tailored for design-time API governance, helping organizations to maintain a structured inventory of all APIs for better discovery and governance.
Note: The following scenarios are not mutually exclusive. They have been separated for better display and clarity.
Azure API Center and API Management Workspaces
Azure API Center will continue to improve the management of APIs across different API Management services, as organizations…
have APIM services across environments, for example, dev, test, and prod.
have more than one production APIM service in their company.
have APIM platforms from multiple vendors.
Conclusion
Announcing Preview of New Azure Dlsv6, Dsv6, Esv6 VMs with new CPU, Azure Boost, and NVMe Support
Co-authored by Andy Jia, Principal Product Manager, and Misha Bansal, Technical Program Manager, Azure Compute
We are thrilled to announce Preview of new Azure General Purpose and Memory Optimized Virtual Machines powered by the latest 5th Generation Intel® Xeon® processor. The new VMs come with three different memory-to-core ratios and offer options with and without local SSD across the VM families: the General Purpose Dlsv6, Dldsv6, Dsv6, and Ddsv6 series and the Memory Optimized Esv6 and Edsv6 series.
These VMs deliver several important improvements as compared to the previous generation (Dlv5, Dv5, Ev5) for a broad range of applications and workloads through technology innovations:
Up to 17% better performance
3X larger L3 cache
Up to 192 vCPUs and >1800 GiB of memory
Azure Boost which enables:
Up to 400k IOPS and 12 GB/s remote storage throughput
Up to 200 Gbps VM network bandwidth
46% larger local SSD capacity and >3X read IOPS
NVMe interface for local and remote disks
Enhanced security through Total Memory Encryption (TME) technology
General Purpose Workloads
The new Dlsv6-series and Dsv6-series VMs offer a balance of memory to CPU performance with increased scalability of up to 128 vCPUs and 512 GiB of RAM. These VMs work well for many general computing workloads, e-commerce systems, web front ends, desktop virtualization solutions, application servers, and more. Below is an overview of the specifications offered by the Dsv6-series and Dlsv6-series VMs.
Series         vCPU     Memory (GiB)  Local Disk (GiB)  Max Data Disks  Network Gbps
Dlsv6-series   2 – 128  4 – 256       n/a               8 – 64          12.5 – 54.0
Dldsv6-series  2 – 128  4 – 256       110 – 7,040       8 – 64          12.5 – 54.0
Dsv6-series    2 – 128  8 – 512       n/a               8 – 64          12.5 – 54.0
Ddsv6-series   2 – 128  8 – 512       110 – 7,040       8 – 64          12.5 – 54.0
Memory Intensive Workloads
The new Esv6-series and Edsv6-series virtual machines are ideal for memory-intensive workloads, offering up to 192 vCPUs and >1800 GiB of RAM. These VMs are suitable to meet requirements associated with most enterprise applications, such as relational database servers, data warehousing workloads, business intelligence applications, and in-memory analytics. Below is an overview of the specifications offered by the Esv6-series and Edsv6-series VMs.
Series         vCPU     Memory (GiB)  Local Disk (GiB)  Max Data Disks  Network Gbps
Esv6-series    2 – 192  16 – >1800    n/a               8 – 64          12.5 – 200.0
Edsv6-series   2 – 192  16 – >1800    110 – 10,560      8 – 64          12.5 – 200.0
The Dlv6, Dv6, and Ev6-series Azure Virtual Machines will offer options with and without local disk storage. VM sizes with a local disk are denoted with a small “d” in the name; VM sizes without a local disk do not have the small “d”. Whether you choose a VM with a local disk or not, you can attach remote persistent disks such as Premium SSD, Premium SSD v2, or Ultra Disks to the VMs.
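The naming convention above (the small "d" marking a local disk) can be checked mechanically. A small sketch using representative size names; the regex is mine, derived only from the pattern described here, not from an official grammar of Azure size names:

```python
import re

def has_local_disk(size_name):
    """Heuristic: a 'd' among the feature letters before 's_v6' marks a local disk."""
    m = re.fullmatch(r"Standard_[DE](\d+)([a-z]*)s_v6", size_name)
    if not m:
        raise ValueError(f"unrecognized size name: {size_name}")
    return "d" in m.group(2)

print(has_local_disk("Standard_D2ds_v6"))  # True  (Ddsv6: has local disk)
print(has_local_disk("Standard_D2ls_v6"))  # False (Dlsv6: no local disk)
```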
Join the Preview
Dlv6, Dv6, and Ev6-series VMs with less than 1 TB of memory are now available for preview in the West US and East US regions. VMs with 1 TB or more of memory will be added to the preview starting in Q3 2024.
If you are currently running Azure Dl/D/Ev5 or older Azure VM series, the Dl/D/E v6-series VMs will provide you with a better price-performance solution and increased scalability.
To request access to the preview, please fill out the survey form here. We look forward to hearing from you.
password field is not taking input
In my Ubuntu 22.04 I was installing MATLAB R2023a; during the process, after filling in the email ID, the password field is not responding (not taking input). MATLAB Answers — New Questions
How do I install MATLAB and its toolboxes?
I would like to install MATLAB and/or its toolboxes on my internet-connected computer. How can I do this?
Detect quantum wire dots (appear as circles) and isolate them from noisy background binarized
Hey All
I am working on a project involving photos from a live feed of quantum wire dots and I was wondering what might be the best approach for isolating the circles and the excited laser properly. My current code can sometimes lead to mistakes and non-circular formations. Any help would be much appreciated. I have shared my current code as well as an image comparison below!
I = "QD.jpg"
%loads image from a file into an array
Img = imread(I);
%grayscales image for easier analysis
ImgGray = im2gray(Img);
%makes a clear contrast between foreground and background
adjustedImg = imadjust(ImgGray);
%binarizing image into 0(black) and 1(white)
BwImg = imbinarize(adjustedImg, ‘adaptive’, ‘ForegroundPolarity’, ‘dark’, ‘Sensitivity’, 0.4);
%inverting it so that foreground appears as white and background as black
invertedBwImg = ~BwImg;
%morpholigically changing the image by opening and closing, in this case removing small white
%objects from black background and removing the remaining black spots from
%the white patches
se = strel("disk",15);
openedImg = imopen(invertedBwImg,se);
closedImg = imclose(openedImg,se);
% Apply a threshold to keep only pixels above the threshold value
thresholdedImg = ImgGray > 120;
% Multiply the original grayscale image by the threshold mask to keep pixel values
excitedimg = uint8(thresholdedImg) .* ImgGray;
% Label connected components
labeledImg = bwlabel(closedImg);
% Measure properties of image regions
stats = regionprops(labeledImg, ‘Eccentricity’);
% Define an eccentricity threshold
eccentricityThreshold = 0.76;
% Initialize a mask for the filtered image
filteredImg = ismember(labeledImg, find([stats.Eccentricity] < eccentricityThreshold));
% Convert logical image(filterImg) to uint8 image in order to comebine with
% excitedimage
uint8Img = uint8(filteredImg) * 255;
%final image with excited light and quantum wires
finalImg = uint8Img + excitedimg;
% Create a new figure for displaying images
figure;
% Display the original image
subplot(1, 2, 1);
imshow(Img);
title(‘Original Image’);
% Display the filtered binary image
subplot(1, 2, 2);
imshow(finalImg);
title(‘Updated Result’);
% Return the figure handle as the result
result = gcf;
I have a vector of length 5; I need to check if a sum of 2 or more elements equals another element in the same vector. How do I do that?
I have a vector of length 5. I need to check whether the sum of 2 or more elements in the vector equals another element of the same vector. How do I do that?
I tried a loop inside a loop but it doesn't work.
If someone has an idea it would be helpful.
Thank you. MATLAB Answers — New Questions
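One way to express the check, sketched here in Python for illustration (a MATLAB version would enumerate index subsets with nchoosek); the vector is made up:

```python
from itertools import combinations

def subset_sum_matches(v):
    """Find (subset_indices, target_index) pairs where >=2 elements sum
    to a different element of the same vector."""
    matches = []
    for r in range(2, len(v)):                      # subset sizes 2 .. len(v)-1
        for idx in combinations(range(len(v)), r):  # indices of the subset
            s = sum(v[i] for i in idx)
            for j in range(len(v)):
                if j not in idx and v[j] == s:
                    matches.append((idx, j))
    return matches

v = [1, 2, 3, 6, 10]
print(subset_sum_matches(v))  # e.g. 1+2=3, 1+2+3=6, 1+3+6=10
```

For length 5 the brute force is tiny (at most 2^5 subsets), so nested enumeration is perfectly adequate.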
Truth Table Test Coverage Results Unclear
I have tested a 3-input truth table with all permutations of inputs (000 001 010 100 011 110 101 111) and have not received condition coverage. It is unclear to me what the red letters mean in my coverage report. I have looked at this link and haven’t gotten much clarity: https://www.mathworks.com/help/slcoverage/ug/coverage-for-truth-tables.html
Here are the table results below. What is the meaning of the red letters in parentheses? For action 2, it says (ok), meaning I clearly tested TTF, but then it gives a red T in that column.
Thank you,
Cody
Upgrade OS for SQL Server
Currently our system has a two-node SQL Server setup:
OS version: Windows Server 2016
SQL version: 2017
Always On: 2 nodes
Now we are planning to upgrade the OS for SQL Server:
OS version: Windows Server 2022 (only change)
SQL version: 2017
Always On: join to AG
Can we join an OS 2022 node to an Always On cluster running on OS 2016? We plan to join the new server to the AG as a secondary, sync the data, and then make the new server the primary…
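For reference, the plan described above would look roughly like the T-SQL below. This is only a sketch: the AG name MyAG, the node name NEWNODE2022, and the endpoint URL are placeholders, not values from the question. Also note that Windows Server failover clusters generally only support mixed OS versions temporarily (during a Cluster OS Rolling Upgrade), so many teams instead build a new cluster on the new OS and migrate via a distributed availability group; check current Microsoft guidance before committing to either route.

```sql
-- On the current primary: add the new node as a replica (placeholder names).
ALTER AVAILABILITY GROUP [MyAG]
    ADD REPLICA ON N'NEWNODE2022'
    WITH (
        ENDPOINT_URL       = N'TCP://NEWNODE2022.contoso.local:5022',
        AVAILABILITY_MODE  = SYNCHRONOUS_COMMIT,
        FAILOVER_MODE      = MANUAL,
        SEEDING_MODE       = AUTOMATIC
    );

-- On the new node, once it can see the availability group:
ALTER AVAILABILITY GROUP [MyAG] JOIN;

-- After the new secondary reports SYNCHRONIZED, fail over on the new node:
ALTER AVAILABILITY GROUP [MyAG] FAILOVER;
```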
Copying template.json from Export Template and deploying in Bicep
I have a question: I want to deploy my resources (already created in the Azure Portal) from my Azure DevOps repo. Is it legitimate to take the template.json from Export Template, convert it into Bicep, and then deploy it, with some changes, from Azure DevOps?
Thank you very much!
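This workflow is supported: the Bicep tooling can decompile an exported ARM template with `az bicep decompile --file template.json`, producing a .bicep file you can clean up and commit. Exported templates often carry noise (hard-coded names, read-only properties), so the decompiled file usually needs trimming before it is pipeline-ready. As a purely illustrative sketch (the resource type, parameter names, and API version here are hypothetical examples, not taken from the question), a tidied resource might look like:

```bicep
// Hypothetical example of a cleaned-up resource after decompiling
// an exported template; names and API version are illustrative.
param location string = resourceGroup().location
param storageAccountName string

resource storage 'Microsoft.Storage/storageAccounts@2023-01-01' = {
  name: storageAccountName
  location: location
  sku: {
    name: 'Standard_LRS'
  }
  kind: 'StorageV2'
}
```

The cleaned file can then be deployed from a pipeline with `az deployment group create --template-file main.bicep` against the target resource group.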
Get attachments from list item
Hello,
We’ve been trying to get the attached files on list items using the API below:
https://graph.microsoft.com/v1.0/sites/{site-id}/lists/{list-id}/items/{item-id}
In the response, an “Attachments” field is present, but it only indicates whether the item has files attached (a true/false field), with no other information. We need to generate a download URL for these files.
Is there a way to get attachment information, such as a download URL, from list items using the Graph API?
Thank you.
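For context, the Graph v1.0 listItem resource does not expose the attachment files themselves; the classic SharePoint REST API does, via the AttachmentFiles endpoint, whose entries include FileName and ServerRelativeUrl for building download links. Below is a minimal Python sketch of constructing that endpoint (the contoso/Marketing/Tasks names are hypothetical; the actual request would need a SharePoint-scoped access token, not a Graph token):

```python
def attachment_files_endpoint(site_url: str, list_title: str, item_id: int) -> str:
    """Build the classic SharePoint REST URL that lists an item's attachments.

    The Graph v1.0 listItem only exposes a boolean Attachments field, so the
    SharePoint REST API is used instead. Each entry in the response's `value`
    array carries FileName and ServerRelativeUrl, which can be combined with
    the tenant host to form a download URL.
    """
    # Note: a list title containing an apostrophe would need it doubled
    # ('' instead of ') to satisfy OData string-literal escaping.
    return (
        f"{site_url.rstrip('/')}/_api/web/lists/"
        f"getbytitle('{list_title}')/items({item_id})/AttachmentFiles"
    )


# Hypothetical example; no request is made here. You would GET this URL with
# an "Authorization: Bearer <token>" header scoped to SharePoint.
url = attachment_files_endpoint(
    "https://contoso.sharepoint.com/sites/Marketing", "Tasks", 7
)
```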
Formula ranges keep changing even when I use a named range
I have been having issues with ranges changing even when they are fixed ($A$7:$A$1600), so I created named ranges in the Name Manager. However, I am seeing those values change after releasing the workbook to the business users.
I ultimately want to know how I can force these ranges to stay as they were designed.
The affected cells have #N/A in them because some of the ranges in the LET statement are out of step with the others. Once they are re-aligned, all is good.
Some examples:
This is how I set them up originally (the exception is fb_FamilyName, which has a crazy value that was originally PI_PackageEditable!$BK7; I am not sure how that value got to be as big as it is. Given it is not fully locked down, I can see that it could change. The others, however, should not change, but they do).
The way they were originally set up.
Added 5 rows and things changed.
And here is another that changed even though it has nothing to do with the worksheet I am working on.
Here is an example of a formula making use of the named ranges. If one of the ranges moves out of sync with the others, the cell shows #N/A.
=LET(
centreBalloon, FILTER(pipkg_LongDesc_Rng, (pipkg_RecordType_Rng="COMPONENT")*(pipkg_PCODE_Rng=$D7), ""),
firstComponentRow, MATCH(1, (INDEX(centreBalloon,,1)="COMPONENT")*(INDEX(centreBalloon,,8)<>"COUPON_ADDON"), 0),
centreBalloonDesc, INDEX(centreBalloon, firstComponentRow, 11),
primeText, PROPER(centreBalloonDesc)&" Bouquet with "&"[COLOUR] [SIZE] [TYPE] Balloons – ",
pieceCount, SUMPRODUCT((INDEX(centreBalloon,,1)="COMPONENT")*(INDEX(centreBalloon,,8)<>"COUPON_ADDON")*(INDEX(centreBalloon,,7))),
CONCAT(primeText, pieceCount, "pc")
)
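One technique that may help pin ranges like these (a sketch, not a guaranteed fix for this workbook): define the names in Name Manager with INDIRECT, since a text reference is never adjusted when users insert or delete rows. For example, a pinned version of the $A$7:$A$1600 range on the PI_PackageEditable sheet could be defined as:

```
=INDIRECT("PI_PackageEditable!$A$7:$A$1600")
```

The trade-off is that INDIRECT is volatile (it recalculates on every change) and the name will no longer follow intentional moves of the data, so it is worth testing before releasing to the business users.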