Category: News
Comparing Microsoft Cloud Email Services
HVE and ECS are two competing Microsoft Cloud Email Services. At least, they seem to compete. In reality, HVE and ECS serve different target audiences. HVE is all about internal email services for apps and devices while ECS is for high volume external mailings like customer newsletters. We tested both services by sending subscription reminder notifications to Office 365 for IT Pros readers.
https://office365itpros.com/2024/08/13/microsoft-cloud-email-services/
Asynchronous HTTP APIs with Azure Container Apps jobs
When building HTTP APIs, it can be tempting to synchronously run long-running tasks in a request handler. This approach can lead to slow responses, timeouts, and resource exhaustion. If a request times out or a connection is dropped, the client won’t know if the operation completed or not. For CPU-bound tasks, this approach can also bog down the server, making it unresponsive to other requests.
In this post, we’ll look at how to build an asynchronous HTTP API with Azure Container Apps. We’ll create a simple API that implements the Asynchronous Request-Reply pattern, with the API hosted in a container app and the asynchronous processing done in a job. This approach provides a much more robust and scalable solution for long-running tasks.
Long-running API requests in Azure Container Apps
Azure Container Apps is a serverless container platform. It’s ideal for hosting a variety of workloads, including HTTP APIs.
Like other serverless and PaaS platforms, Azure Container Apps is designed for short-lived requests — its ingress currently has a maximum timeout of 4 minutes. As an autoscaling platform, it’s designed to scale dynamically based on the number of incoming requests. When scaling in, replicas are removed. Long-running requests can terminate abruptly if the replica handling the request is removed.
Azure Container Apps jobs
Azure Container Apps has two types of resources: apps and jobs. Apps are long-running services that respond to HTTP requests or events. Jobs are tasks that run to completion and can be triggered by a schedule or an event.
Jobs can also be triggered programmatically. This makes them a good fit for implementing asynchronous processing in an HTTP API. The API can start a job execution to process the request and return a response immediately. The job can then take as long as it needs to complete the processing. The client can poll a status endpoint on the app to check if the job has completed and get the result.
The Asynchronous Request-Reply pattern
Asynchronous Request-Reply is a common pattern for handling long-running operations in HTTP APIs. Instead of waiting for the operation to complete, the API returns a status code indicating that the operation has started. The client can then poll the API to check if the operation has completed.
Here’s how the pattern applies to Azure Container Apps:
The client sends a request to the API (hosted as a container app) to start the operation.
The API saves the request (our example uses Azure Cosmos DB), starts a job to process the operation, and returns a 202 Accepted status code with a Location header pointing to a status endpoint.
The client polls the status endpoint. While the operation is in progress, the status endpoint returns a 200 OK status code with a Retry-After header indicating when the client should poll again.
When the operation is complete, the status endpoint returns a 303 See Other status code with a Location header pointing to the result. The client automatically follows the redirect to get the result.
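To make the flow concrete, here is a hypothetical exchange; the order ID and timing are invented, but the status codes and headers follow the pattern described above:
POST /orders HTTP/1.1
Content-Type: application/json

HTTP/1.1 202 Accepted
Location: /orders/status/1b2c3d

GET /orders/status/1b2c3d HTTP/1.1

HTTP/1.1 200 OK
Retry-After: 10
{ "status": "pending" }

GET /orders/status/1b2c3d HTTP/1.1

HTTP/1.1 303 See Other
Location: /orders/1b2c3d

GET /orders/1b2c3d HTTP/1.1

HTTP/1.1 200 OK
{ "id": "1b2c3d", "status": "completed", "order": { ... } }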
Async HTTP API app
You can find the source code in this GitHub repository.
The API is a simple Node.js app that uses Fastify. It demonstrates how to build an async HTTP API that accepts orders and offloads the processing of the orders to jobs. The app has a few simple endpoints.
POST /orders
This endpoint accepts an order in its body. It saves the order to Cosmos DB with a status of “pending” and starts a job execution to process the order.
fastify.post('/orders', async (request, reply) => {
  const orderId = randomUUID()
  // save the order to Cosmos DB with a "pending" status
  await container.items.create({
    id: orderId,
    status: 'pending',
    order: request.body,
  })
  // start the job execution that processes the order
  await startProcessorJobExecution(orderId)
  // return 202 Accepted with a Location header pointing to the status endpoint
  reply.code(202).header('Location', '/orders/status/' + orderId).send()
})
We’ll take a look at the job later in this article. In the above code snippet, startProcessorJobExecution is a function that starts the job execution. It uses the Azure Container Apps management SDK to start the job.
const credential = new DefaultAzureCredential()
const containerAppsClient = new ContainerAppsAPIClient(credential, subscriptionId)
// …
async function startProcessorJobExecution(orderId) {
  // get the existing job's template
  const { template: processorJobTemplate } =
    await containerAppsClient.jobs.get(resourceGroupName, processorJobName)
  // add the order ID to the job's environment variables
  const environmentVariables = processorJobTemplate.containers[0].env
  environmentVariables.push({ name: 'ORDER_ID', value: orderId })
  // start the job execution with the modified template
  const jobExecution = await containerAppsClient.jobs.beginStartAndWait(
    resourceGroupName, processorJobName, {
      template: processorJobTemplate,
    }
  )
}
The job takes the order ID as an environment variable. To set the environment variable, we start the job execution with a modified template that includes the order ID.
We use managed identities to authenticate with both the Azure Container Apps management SDK and the Cosmos DB SDK.
GET /orders/status/:orderId
The previous endpoint returns a 202 Accepted status code with a Location header pointing to this status endpoint. The client can poll this endpoint to check the status of the order.
This request handler retrieves the order from Cosmos DB. If the order is still pending, it returns a 200 OK status code with a Retry-After header indicating when the client should poll again. If the order is complete, it returns a 303 See Other status code with a Location header pointing to the result.
fastify.get('/orders/status/:orderId', async (request, reply) => {
  const { orderId } = request.params
  // get the order from Cosmos DB
  const { resource: item } = await container.item(orderId, orderId).read()
  if (item === undefined) {
    reply.code(404).send()
    return
  }
  if (item.status === 'pending') {
    reply.code(200).headers({
      'Retry-After': 10,
    }).send({ status: item.status })
  } else {
    reply.code(303).header('Location', '/orders/' + orderId).send()
  }
})
GET /orders/:orderId
This endpoint returns the result of the order processing. The status endpoint redirects to this resource when the order is complete. It retrieves the order from Cosmos DB and returns it.
fastify.get('/orders/:orderId', async (request, reply) => {
  const { orderId } = request.params
  // get the order from Cosmos DB
  const { resource: item } = await container.item(orderId, orderId).read()
  if (item === undefined || item.status === 'pending') {
    reply.code(404).send()
    return
  }
  if (item.status === 'completed') {
    reply.code(200).send({ id: item.id, status: item.status, order: item.order })
  } else if (item.status === 'failed') {
    reply.code(500).send({ id: item.id, status: item.status, error: item.error })
  }
})
Order processor job
The order processor job is another Node.js app. Since this is just a demo, it waits a while, updates the order status in Cosmos DB, and exits. In a real-world scenario, the job would process the order, update the order status, and possibly send a notification.
We deploy it as a job in Azure Container Apps. The POST /orders endpoint above starts the job execution. The job takes the order ID as an environment variable and uses it to update the order status in Cosmos DB.
Like the API app, the job uses managed identities to authenticate with Azure Cosmos DB.
The code is in the same GitHub repository.
import { DefaultAzureCredential } from '@azure/identity'
import { CosmosClient } from '@azure/cosmos'

const credential = new DefaultAzureCredential()
const client = new CosmosClient({
  endpoint: process.env.COSMOSDB_ENDPOINT,
  aadCredentials: credential
})
const database = client.database('async-api')
const container = database.container('statuses')

const orderId = process.env.ORDER_ID
const orderItem = await container.item(orderId, orderId).read()
const orderResource = orderItem.resource
if (orderResource === undefined) {
  console.error('Order not found')
  process.exit(1)
}

// simulate processing time
const orderProcessingTime = Math.floor(Math.random() * 30000)
console.log(`Processing order ${orderId} for ${orderProcessingTime}ms`)
await new Promise(resolve => setTimeout(resolve, orderProcessingTime))

// update the order status in Cosmos DB
orderResource.status = 'completed'
orderResource.order.completedAt = new Date().toISOString()
await orderItem.item.replace(orderResource)
console.log(`Order ${orderId} processed`)
HTTP client
To call the API and wait for the result, here’s a simple JavaScript function that works just like fetch but waits for the job to complete. It also accepts a callback function that’s called each time the status endpoint is polled so you can log the status or update the UI.
async function fetchAndWait() {
  const input = arguments[0]
  let init = arguments[1]
  let onStatusPoll = arguments[2]
  // if arguments[1] is a function, it's the onStatusPoll callback and init was omitted
  if (typeof init === 'function') {
    init = undefined
    onStatusPoll = arguments[1]
  }
  onStatusPoll = onStatusPoll || (async () => {})
  // make the initial request
  const response = await fetch(input, init)
  if (response.status !== 202) {
    throw new Error(`Something went wrong\nResponse: ${await response.text()}\n`)
  }
  const responseOrigin = new URL(response.url).origin
  let statusLocation = response.headers.get('Location')
  // if the Location header is not an absolute URL, construct it
  statusLocation = new URL(statusLocation, responseOrigin).href
  // poll the status endpoint until it's redirected to the final result
  while (true) {
    const response = await fetch(statusLocation, {
      redirect: 'follow'
    })
    if (response.status !== 200 && !response.redirected) {
      const data = await response.json()
      throw new Error(`Something went wrong\nResponse: ${JSON.stringify(data, null, 2)}\n`)
    }
    // redirected, return the final result and stop polling
    if (response.redirected) {
      const data = await response.json()
      return data
    }
    // the Retry-After header indicates how long to wait before polling again
    const retryAfter = parseInt(response.headers.get('Retry-After')) || 10
    // call the onStatusPoll callback so we can log the status or update the UI
    await onStatusPoll({
      response,
      retryAfter,
    })
    await new Promise(resolve => setTimeout(resolve, retryAfter * 1000))
  }
}
To use the function, we call it just like fetch. We pass an additional argument that’s a callback function that’s invoked each time the status endpoint is polled.
const order = await fetchAndWait('/orders', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json'
  },
  body: JSON.stringify({
    customer: 'Contoso',
    items: [
      {
        name: 'Apple',
        quantity: 5
      },
      {
        name: 'Banana',
        quantity: 3
      },
    ],
  })
}, async ({ response, retryAfter }) => {
  const { status } = await response.json()
  const requestUrl = response.url
  messagesDiv.innerHTML += `Order status: ${status}; retrying in ${retryAfter} seconds (${requestUrl})\n`
})
// display the final result
document.querySelector('#order').innerHTML = JSON.stringify(order, null, 2)
If we run this in the browser, we can open up dev tools and see all the HTTP requests that are made.
In the portal, we can also see the job execution history.
Conclusion
With the Asynchronous Request-Reply pattern, we can build robust and scalable HTTP APIs that handle long-running operations. By using Azure Container Apps jobs, we can offload the processing to a job execution that doesn’t consume resources from the API app. This robust approach allows the API to respond quickly and handle many requests concurrently.
Originally posted on anthonychu.ca
Is there any utility to display messages or variables in a custom criteria script of Simulink Test?
In the custom criteria script of Simulink Test, disp() doesn't work, so it is very hard to debug the script. I had to use error(), and it doesn't support most types like structures, cells, etc. Is there any utility available for debugging a custom criteria script? simulink test, custom criteria script MATLAB Answers — New Questions
HDL Cosimulation with Cadence Xcelium setup
When running the example "GettingStartedWithSimulinkHDLCosimExample" with Cadence Xcelium, I get the following messages.
Executing nclaunch tclstart commands…
xmsim(64): 22.09-s004: (c) Copyright 1995-2022 Cadence Design Systems, Inc.
xmsim: *W,NOMTDGUI: Multi-Threaded Dumping is disabled for interactive debug mode.
xmsim: *E,STRPIN: Could not initialize SimVision connection: SimVision/Indago process terminated before a connection was established.
ERROR: ld.so: object ‘/tools/matlab/R2023bU1/sys/os/glnxa64/libstdc++.so.6’ from LD_PRELOAD cannot be preloaded: ignored.
ERROR: ld.so: object ‘/tools/matlab/R2023bU1/sys/os/glnxa64/libstdc++.so.6’ from LD_PRELOAD cannot be preloaded: ignored.
ERROR: ld.so: object ‘/tools/matlab/R2023bU1/sys/os/glnxa64/libstdc++.so.6’ from LD_PRELOAD cannot be preloaded: ignored.
ERROR: ld.so: object ‘/tools/matlab/R2023bU1/sys/os/glnxa64/libstdc++.so.6’ from LD_PRELOAD cannot be preloaded: ignored.
ERROR: ld.so: object ‘/tools/matlab/R2023bU1/sys/os/glnxa64/libstdc++.so.6’ from LD_PRELOAD cannot be preloaded: ignored.
ERROR: ld.so: object ‘/tools/matlab/R2023bU1/sys/os/glnxa64/libstdc++.so.6’ from LD_PRELOAD cannot be preloaded: ignored.
ERROR: ld.so: object ‘/tools/matlab/R2023bU1/sys/os/glnxa64/libstdc++.so.6’ from LD_PRELOAD cannot be preloaded: ignored.
ERROR: ld.so: object ‘/tools/matlab/R2023bU1/sys/os/glnxa64/libstdc++.so.6’ from LD_PRELOAD cannot be preloaded: ignored.
/tools/cds/xceliummain_22.09.004_Linux/tools.lnx86/simvision/bin/64bit/simvision: line 48: cds_plat: command not found
/tools/cds/xceliummain_22.09.004_Linux/tools.lnx86/simvision/bin/64bit/simvision: line 101: /tools/cds/xceliummain_22.09.004_Linux/tools./simvision/bin/64bit/simvision.exe: No such file or directory
SimVision/Indago process terminated before a connection could be established.
while executing
"exec <@stdin >@stdout xmsim -gui rcosflt_rtl -64bit -input {@simvision {set w [waveform new]}} -input {@simvision {waveform add -using $w -signals rco…"
("uplevel" body line 1)
invoked from within
"uplevel 1 [join $args]"
(procedure "hdlsimulink" line 22)
invoked from within
"hdlsimulink rcosflt_rtl -64bit -socket 44014 -input "{@simvision {set w [waveform new]}}" -input "{@simvision {waveform add -using $w -signals rc…"
(file "compile_and_launch.tcl" line 66)
ERROR hit any key to exit xterm
Could you please guide me how to set up cosimulation with Cadence Xcelium? cosimulation, cadence, hldverifier MATLAB Answers — New Questions
Link one Excel file's cell A1 on SharePoint to another Excel file as a link to navigate to that file
Hi
I want to link cell A1 of one Excel file on SharePoint to another Excel file's SharePoint location, so that clicking the cell navigates to that file.
Windows 11 Keeps Asking For My PIN Even After I've Told It Not To
I had my computer set to ‘Never ask for PIN’ and it was doing just fine until I restarted it and it installed some updates. Now I am back to it asking for my PIN every time it wakes from sleep. I SO do not need this! I live alone and no one else is able to touch my computer. I am the administrator and only user for the computer.
I have opened the netplwiz file, and unchecked the box that says a user must sign in, but the problem persists.
Recently, I also gave Google access to my Microsoft files, and I am wondering if Google is responsible for the problem. I got a popup today saying Google was asking for my PIN. This was not when I was signing into the computer after waking it up, though, but a different situation.
Please help – have tried everything I can think of!
Why is an element in the queue deleted even if the function throws an exception?
I am writing an Azure Function with a queue trigger, and I want to send the data to the backend service when it is available; if the backend is not available, the message should remain in the queue.
My question is: how can I achieve this?
My code and host.json look like this:
[Function("QueueCancellations")]
public async Task<IActionResult> QueueCancellation([QueueTrigger("requests", Connection = "ConnectionStrings:QUEUE_CONNECTION_STRING")] string message)
{
    try
    {
        using (var httpClient = new HttpClient())
        {
            var content = new StringContent(message, Encoding.UTF8, "application/json");
            var httpResponse = await httpClient.PostAsync(_configuration["LOCAL_SERVICE_URL_CANCELL"], content);
            if (httpResponse.IsSuccessStatusCode)
            {
                return new OkObjectResult("Data sent to backend");
            }
            else
            {
                return new BadRequestObjectResult("Backend not available");
            }
        }
    }
    catch (Exception ex)
    {
        _logger.LogError(ex.Message);
        return new BadRequestObjectResult("Backend not available");
    }
}

{
  "version": "2.0",
  "logging": {
    "applicationInsights": {
      "samplingSettings": {
        "isEnabled": false,
        "excludedTypes": "Request"
      },
      "enableLiveMetricsFilters": true
    }
  },
  "logLevel": {
    "default": "Information",
    "Host.Results": "Information",
    "functions": "Information",
    "Host.Aggregator": "Information"
  },
  "extensions": {
    "queues": {
      "maxPollingInterval": "00:00:02",
      "visibilityTimeout": "00:00:30",
      "batchSize": 16,
      "maxDequeueCount": 5,
      "newBatchThreshold": 8,
      "messageEncoding": "base64"
    }
  }
}
Recreate mailboxes when only the AD users remain
How do I cleanly recreate mailboxes when only the AD users remain? Only the AD users were restored; the Exchange database is lost, so I created a new database on a new installation.
Thanks for suggestions.
Implement Rule-Based scaling in Logic App Standard via Logic App Consumption
Background
As we all know, Logic App Standard on a WS (Workflow Standard) App Service Plan follows the same target-based scaling as the Azure Functions Elastic Premium plan, which depends mostly on job queue length rather than CPU/memory usage.
But in some situations, we can experience high CPU/memory usage without a long job queue, which means the scale controller cannot vote to increase backend instances as expected.
So this blog introduces how to "simulate" a rule-based scaling feature for Logic App Standard on the WS plan, similar to what Logic App Standard running in an ASE offers, by using Logic App Consumption (other Azure products work as well, e.g., Azure Functions).
Mechanism
Since we need to scale based on CPU/memory usage (other metrics work as well), the first issue to resolve is getting the metrics data, as follows:
For example, if we would like to get the average CPU usage, we can go to the CPU Percentage page and capture a HAR trace, which tells us that the API used by the portal looks like:
https://management.azure.com/subscriptions/[SubscriptionID]/resourceGroups/[ResourceGroup]/providers/Microsoft.Web/serverFarms/[ASP Name]/providers/microsoft.Insights/metrics?timespan=[StartTime]/[EndTime]&interval=PT1M&metricnames=CpuPercentage&aggregation=average&metricNamespace=microsoft.web%2Fserverfarms&autoadjusttimegrain=true&validatedimensions=false&api-version=2019-07-01
Once we have the CPU usage data, we can calculate whether the current value hits the threshold; if so, we can call the management API to change the always-ready instance count (Web Apps – Update Configuration – REST API (Azure App Service) | Microsoft Learn).
Implementation Logic
1. Retrieve CPU usage via the API for the past 2 minutes and calculate the average
2. Retrieve the current Logic App backend instance count and set the target instance count to the current value
3. Validate whether the average CPU usage hits the scale-in/out threshold (>50% or <10%)
If larger than the scale-out threshold, add 1 to the target instance count if the maximum (20) instance count has not been reached
If less than the scale-in threshold, subtract 1 from the target instance count if the minimum (1) instance count has not been reached
4. Compare the current and target values; if they differ, send a request to change the always-ready instance count (a code sketch of this flow follows)
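For illustration, here is a minimal Node.js sketch of this flow. It assumes a global fetch (Node 18+); the placeholder resource names, the minimumElasticInstanceCount property, and the API versions are assumptions to verify against your environment. The blog itself implements the same logic as a Logic App Consumption workflow.
import { DefaultAzureCredential } from '@azure/identity'

// Hypothetical resource identifiers; replace with your own values.
const subscriptionId = '<subscription-id>'
const resourceGroup = '<resource-group>'
const aspName = '<app-service-plan>'
const siteName = '<logic-app-standard-name>'
const apiBase = 'https://management.azure.com'

const credential = new DefaultAzureCredential()
const { token } = await credential.getToken('https://management.azure.com/.default')
const headers = { Authorization: `Bearer ${token}`, 'Content-Type': 'application/json' }

// 1. Average CPU percentage of the ASP over the past 2 minutes
const end = new Date()
const start = new Date(end.getTime() - 2 * 60 * 1000)
const metricsUrl = `${apiBase}/subscriptions/${subscriptionId}/resourceGroups/${resourceGroup}` +
  `/providers/Microsoft.Web/serverFarms/${aspName}/providers/microsoft.Insights/metrics` +
  `?timespan=${start.toISOString()}/${end.toISOString()}&interval=PT1M` +
  `&metricnames=CpuPercentage&aggregation=average&api-version=2019-07-01`
const metrics = await (await fetch(metricsUrl, { headers })).json()
const points = metrics.value[0].timeseries[0].data
const avgCpu = points.reduce((sum, p) => sum + (p.average ?? 0), 0) / points.length

// 2. Current always-ready instance count from the site configuration
const configUrl = `${apiBase}/subscriptions/${subscriptionId}/resourceGroups/${resourceGroup}` +
  `/providers/Microsoft.Web/sites/${siteName}/config/web?api-version=2022-03-01`
const config = await (await fetch(configUrl, { headers })).json()
const current = config.properties.minimumElasticInstanceCount
let target = current

// 3. Apply the scale-out/in thresholds within the 1-20 bounds
if (avgCpu > 50 && target < 20) target += 1
else if (avgCpu < 10 && target > 1) target -= 1

// 4. Patch the configuration only when the target differs from the current value
if (target !== current) {
  await fetch(configUrl, {
    method: 'PATCH',
    headers,
    body: JSON.stringify({ properties: { minimumElasticInstanceCount: target } }),
  })
}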
Sample Template
Demo can be found in Drac-Zhang/Rule_Based_Scaling_Template (github.com) which implemented the scaling feature based on CPU usage of past 2 minutes.
Meanwhile, for the Logic App Consumption, we need to enable Managed Identity and assign the "Reader" and "Logic App Standard Contributor" roles to the managed identity at the resource group level.
Known Issues
1. Since it monitors CPU/memory usage at the ASP level, the ASP can only contain one Logic App; this can be worked around by changing the metrics call to query Logic App-specific CPU/memory usage.
2. The Logic App Standard configuration API changes the always-ready instance count; the maximum value is 20 on the current WS plan.
3. There is around 1 minute of latency when retrieving metrics, due to ingestion delay.
Transform Application Development with .NET Aspire: Seamless Integration with JavaScript and Node.js
In the ever-evolving landscape of cloud application development, managing configurations, ensuring resilience, and maintaining seamless integration between various components can be quite challenging.
This is where .NET Aspire comes into play! A robust application development framework designed to simplify these complexities, allowing developers to focus on creating features rather than dealing with extensive configurations.
In this article, we will explore the core aspects of .NET Aspire, examining its benefits, the configuration process, and integration with JavaScript, as presented in an outstanding session at the recent .NET Aspire Developers Day by Chris Noring, Senior Developer Advocate at Microsoft.
.NET Aspire Developers Day
The latest .NET Aspire Developers Day, which took place on July 23, 2024, was a great event with lots of technical and practical sessions, featuring different programming languages and frameworks. The main goal of this online event was to show how easy it is to develop modern applications with the power of .NET Aspire!
No worries if you missed the event! Here's the link to the recording so you can check it out and learn more about .NET Aspire and how it can help you in different software development situations.
.NET Aspire Developers Day Online Event
So, what exactly is .NET Aspire? Let’s dive in and find out more!
Understanding .NET Aspire
.NET Aspire is a cloud-ready framework that helps you build distributed and production-ready applications. It’s got NuGet packages that make it easier to build apps that are made up of small, connected services, which are called microservices.
Purpose of .NET Aspire
.NET Aspire is all about making the development process easier, especially when it comes to building cloud-based apps. It’s got tools and patterns that make everything easier, from getting set up to running distributed applications. And also, .NET Aspire makes orchestration simple. It automatically connects projects and their dependencies, so you don’t have to worry about the technical details.
Simplified Orchestration
Orchestration in .NET Aspire is all about making your local development environment easier to use by automating the configuration and interconnection of multiple projects and their dependencies. It’s not meant to replace the kind of robust systems you’d use in production, like Kubernetes. What .NET Aspire does is provide abstractions that make it easier to set up services, find them, and configure containers.
Ready-to-Use Components
.NET Aspire also comes with ready-to-use components like Redis or PostgreSQL that you can add to your project with just a few lines of code. Plus, it’s got project templates and tools for Visual Studio, Visual Studio Code, and the .NET CLI, so it’s a breeze to create and manage your projects.
Usage Example
For instance, you can add a Redis container with just a few lines of code and set up the connection string automatically in the Frontend project.
var builder = DistributedApplication.CreateBuilder(args);
var cache = builder.AddRedis("cache");
builder.AddProject<Projects.MyFrontend>("frontend")
       .WithReference(cache);
If you want to learn more about .NET Aspire, I suggest checking out the official documentation. It’s got all the details and examples you need to get started developing with .NET Aspire.
Access the official .NET Aspire documentation now: Official .NET Aspire Documentation
Getting Started with .NET Aspire with JavaScript
At the .NET Aspire Developers Day session, Chris Noring showed us an amazing integration between .NET Aspire and JavaScript. He demonstrated how we can use the power of .NET Aspire and the flexibility of JavaScript to create modern, distributed applications.
If you would like to watch Chris Noring’s full session, just click on the link below:
He started off by showing us how simple it is to get started with .NET Aspire. All you need to do is install a few things:
.NET 8
.NET Aspire Workload
An OCI-compatible container runtime, such as Docker or Podman
Visual Studio Code or Visual Studio
C# Dev Kit Extension
It’s really straightforward to set up a .NET Aspire project. You can use Visual Studio, Visual Studio Code, or just the terminal.
For instance, you can create a new project using the terminal with this command:
dotnet new aspire-starter
This command puts together a project structure that includes the main bits and pieces you need, like AppHost (the brains of the operation), ServiceDefaults, and a starter application.
Once you have got the project structured, the next thing to do is to run it. Before you get started, you’ll need to make sure that HTTPS is enabled, because .NET Aspire needs HTTPS to work.
To get HTTPS up and running, you can use this command:
dotnet dev-certs https --trust
To get the project up and running, just use the command:
dotnet run
When you run the AppHost project, you’ll see a dashboard with all the resources in your project, like APIs and Frontend services. This dashboard gives you a great overview of your application’s metrics, logs, and active requests, making it easier to monitor and debug your cloud application.
Chris Noring showed us all this at his session at .NET Aspire Developers Day. He made it look really easy and practical to start developing modern applications with .NET Aspire.
If you like, I recommend reading the tutorial: “Quickstart: Build your first .NET Aspire project” available in the official .NET Aspire documentation.
More on Orchestration with .NET Aspire
Let’s take a closer look at what Chris Noring shared in this part of the session.
Orchestrating distributed applications with .NET Aspire is all about setting up and linking the different parts that make up the application. The aspire-manifest.json file is key to this process. It shows how the services connect and configure within the application.
This automation makes life easier for developers, so they don’t have to manually configure each connection and dependency.
The Role of aspire-manifest.json
The aspire-manifest.json is a JSON file that’s automatically generated by .NET Aspire. It contains all the necessary info about the app’s resources and components.
It’s got all the details you need, like connection strings, environment variables, ports, and communication protocols. This manifest makes sure all the app’s services connect right and work together smoothly.
Let’s take a look at the example Chris Noring showed us in the session on how to configure a Redis cache and a Node.js Product API using the Program.cs file.
var cache = builder.AddRedis("cache");
var productApi = builder.AddNpmApp("productapi", "../NodeApi", "watch")
    .WithReference(cache)
    .WithHttpEndpoint(env: "PORT")
    .WithExternalHttpEndpoints()
    .PublishAsDockerFile();
In this example, Redis is configured as a cache service, and the product API, developed in Node.js, is configured to use this cache. The WithReference(cache) method ensures that the product API can connect to Redis. The PublishAsDockerFile() method creates a Dockerfile for the application, allowing it to run in a container.
How Does the Manifest Reflect These Configurations?
Once you have run the code, .NET Aspire will create an aspire-manifest.json file that shows all the settings you’ve configured in the code. Chris goes over how the manifest shows the Redis and Product API configuration:
{
  "productapi": {
    "type": "dockerfile.v0",
    "path": "../NodeApi/Dockerfile",
    "context": "../NodeApi",
    "env": {
      "NODE_ENV": "development",
      "ConnectionStrings__cache": "{cache.connectionString}",
      "PORT": "{productapi.bindings.http.port}"
    },
    "bindings": {
      "http": {
        "scheme": "http",
        "protocol": "tcp",
        "transport": "http",
        "targetPort": 8000,
        "external": true
      }
    }
  }
}
In this part of the manifest, we can see that the Product API (productapi) is set up to use the Redis connection string (ConnectionStrings__cache), which is automatically generated and added to the application’s environment. On top of that, the manifest says that the Product API will be available via HTTP on port 8000.
How to Create or Update the Manifest?
If you want to generate or update the aspire-manifest.json file, you can use this command:
dotnet run --publisher manifest --output-path aspire-manifest.json
This command runs the application and generates the manifest, which is essential for deployment in production environments or testing during development.
Integrating JavaScript with .NET Aspire
.NET Aspire is flexible enough to work with JavaScript, so you can use it for both front-end and back-end development. This lets developers use popular JavaScript frameworks and libraries with .NET components, creating one unified development environment.
Front-End Example with Angular
Chris Noring showed us how to integrate .NET Aspire into a front-end project developed in Angular. You can make backend configuration and API connection easier by using environment variables. These are automatically generated and injected into the project.
Backend Configuration in Angular
The proxy.conf.js file is used to redirect API calls in the development environment to the right backend. The backend URLs, which can be different in each environment, are managed using environment variables. Here’s an example of how it’s set up:
module.exports = {
  "/api": {
    target: process.env["services__weatherapi__https__0"] || process.env["services__weatherapi__http__0"],
    secure: process.env["NODE_ENV"] !== "development",
    pathRewrite: { "^/api": "" },
  },
};
In this example, we are setting the target based on the environment variables services__weatherapi__https__0 or services__weatherapi__http__0, which are automatically injected by .NET Aspire. This configuration makes sure that the Angular front end can connect to the backend service no matter what environment it’s in (development, test, or production).
Using HttpClient in Angular
In Angular, you can interact with the backend using the HttpClient service, as shown in the following example:
constructor(private http: HttpClient) {
  this.http.get<WeatherForecast[]>('api/weatherforecast').subscribe({
    next: result => this.forecasts = result,
    error: console.error
  });
}
This snippet shows how the call to the api/weatherforecast API is automatically redirected to the correct backend, thanks to the configuration made in proxy.conf.js. This makes it easier for the Angular frontend and the backend to communicate, and it makes sure that the environment variables set up in the .NET Aspire manifest are used correctly.
Integration with Node.js and .NET Aspire
.NET Aspire makes it easy to manage .NET apps and also lets you easily add other technologies like Node.js. This flexibility lets you build distributed apps that combine different tech stacks in a really efficient way.
Orchestration in AppHost
In the orchestration performed in AppHost, .NET Aspire makes it easy for you to connect different components of your application, such as a Node.js Frontend and a Backend API.
The following code shows how AppHost is configured.
var cache = builder.AddRedis("cache");
var weatherapi = builder.AddProject<Projects.AspireWithNode_AspNetCoreApi>("weatherapi");
var frontend = builder.AddNpmApp("frontend", "../NodeFrontend", "watch")
    .WithReference(weatherapi)
    .WithReference(cache)
    .WithHttpEndpoint(env: "PORT")
    .WithExternalHttpEndpoints()
    .PublishAsDockerFile();
In this example, we’ve got Redis as the cache, the weatherapi for the weather forecast API, and the frontend is the Node.js application. The WithReference() function links everything together, so the frontend can access both Redis and the API.
Using PublishAsDockerFile() lets you package the frontend as a Docker container, which makes it easy to deploy in any environment.
In the Node.js Application…
In this example, the Node.js app is set up to get the cache address and API URL straight from the Aspire project.
This is done through environment variables that are automatically generated based on the resources defined in the .NET Aspire manifest.
const cacheAddress = env['ConnectionStrings__cache'];
const apiServer = env['services__weatherapi__https__0'] ?? env['services__weatherapi__http__0'];
.NET Aspire automatically injects environment variables like ConnectionStrings__cache and services__weatherapi into the Node.js application’s runtime environment. These variables have the info needed for the app to connect to Redis and the weather forecast API correctly.
With this info, the app can easily access the cache and the API without having to hard-code URLs or connection strings. This makes it easier to maintain the code and also makes sure the application works correctly in different environments (development, test, production).
Example of Usage in an Express Route
Here’s an example of how this configuration is used in an Express route in a Node.js application:
app.get('/', async (req, res) => {
  let cachedForecasts = await cache.get('forecasts');
  if (cachedForecasts) {
    res.render('index', { forecasts: JSON.parse(cachedForecasts) });
    return;
  }
  let response = await fetch(`${apiServer}/weatherforecast`);
  let forecasts = await response.json();
  await cache.set('forecasts', JSON.stringify(forecasts));
  res.render('index', { forecasts });
});
The app first checks the Redis cache for weather forecasts. If the data is in the cache, it’s displayed right away. Otherwise, the app makes a request to the weather forecast API (apiServer), stores the results in the cache, and then displays them.
This logic really speeds up the app and makes it more efficient, so data is retrieved from the cache as quickly as possible.
Wrapping Up
.NET Aspire is a great tool for building modern apps, especially those that use cloud-based, distributed systems. It gets rid of a lot of the complexity that comes with orchestration, service discovery, and environment configuration, so developers can focus more on writing business logic and less on managing infrastructure.
It’s also great that it works seamlessly with JavaScript and Node.js. This makes it a really useful tool for building robust, scalable apps that can work across different tech stacks.
If you’re looking to make your development workflow easier while making sure your apps are ready for the real world, .NET Aspire is worth checking out.
Don’t miss the opportunity to dive deeper into .NET Aspire’s capabilities. Watch the full session by Chris Noring at .NET Aspire Developers Day to see these concepts in action and learn more about how you can leverage this powerful framework in your projects.
Additional Resources
If you’re looking to keep learning and getting more proficient in .NET Aspire, here are some additional resources to help you out:
Official Documentation – .NET Aspire
Orchestrate Node.js apps in .NET Aspire
Code Sample: .NET Aspire with Angular, React, and Vue
Code Sample: .NET Aspire + Node.js
Free Course: Build distributed applications with .NET Aspire
Video series: Welcome to .NET Aspire
I hope this article has been helpful and inspiring for you. If you have any questions or suggestions, please feel free to share them in the comments below. I’m here to help and support you as you learn and grow professionally.
Until next time! Keep learning, creating, and sharing!
Building Power Apps Canvas App with Multimedia Integration in SharePoint – Audio Player
In this article, I’ll demonstrate how to build a Power Apps canvas application from within SharePoint that integrates a SharePoint list backend for storing and retrieving images and audio files. The app that you create will have labels, icons, navigation, image containers, and other elements. You’ll use a layout template and customize the built-in audio control to have a distinct appearance and behavior.
By following the step-by-step instructions in this article, you will be able to create a powerful and fully functional canvas application that’s integrated with SharePoint. By using SharePoint lists for storing and retrieving images and audio files, you can ensure that your app is efficient and easy to manage. The customization options available through Power Apps make it possible to create a unique and visually appealing user interface that’s tailored to your needs. Additionally, you can customize the built-in audio control for a distinct appearance and behavior that will enhance the user experience. With the help of this article, you can create an application that’s sure to impress and provide a valuable tool for your organization.
To begin, you need to have:
A Microsoft 365 license or Microsoft 365 developer account.
A basic understanding about SharePoint (site, list and document library) and Power Apps development.
The high-level sequence of tasks is as follows:
Create elements in SharePoint.
Design and configure the application interface and controls in Power Apps.
Creating a Track List, Picture Library, and Document Library in SharePoint
First, you’ll build the elements in SharePoint.
Step 1: Build the Track List in SharePoint List
In the track list, you’ll create columns for title, artist, artwork, and audio file, where you will add associated data.
Open SharePoint, and then create a new SharePoint site or use an existing site.
Create a new list with the following sample columns (Title, Artist, Artwork, AudioFile), as shown:
The list columns have the following associated data types:
Title – Text
Artist – Text
Artwork – Image
AudioFile – Multiple lines of text (stores the audio file link)
You’ve now created the track list with data. Next, you’ll create the picture library.
Step 2: Create the SharePoint Picture Library to Hold Your Images
Create a picture library to hold all your audio image covers.
To do this, navigate to the site where you want to create the picture library. Select Settings, and then select Add an app. If you don’t see Add an app, select Site contents, and then select Add an app, as shown:
On the Your Apps page, enter Picture into the search box, or select Picture Library. Then in New, select Advanced Options. In Name, enter a name for the library. The library name is required. Enter a description of the library (this is optional), and then choose whether to create a version each time you edit a file in the picture library.
Select Create. The name of the picture library appears under Recent in the Quick Launch panel.
Step 3: Create a SharePoint Document Library to Hold Audio Files
This is the final development stage in SharePoint, where you will create a placeholder for storing audio files. In your SharePoint site, select New > Document library > Create (after naming it), as shown in the following video demo.
Video 1: Demonstrates how to create a SharePoint document library.
Developing the Power Apps Application
You can use Power Apps to build custom apps for business scenarios such as data collection, data management, and data visualization, without having to write complex code.
At this point, you’ll develop the Power Apps application UI on top of the data that you created in SharePoint. Although you could visit make.powerapps.com to start a new blank canvas app and connect it to SharePoint, here you’ll generate the app directly from the SharePoint list that already holds this data.
Open your track list.
Select Integrate > Power Apps > Create an app (figure 5).
Enter a name for your app, and then select Create.
Designing the Application Controls
Once you’ve generated the app from your SharePoint list, you will construct the interface and configure the controls within the app.
Step 1: Configure and Design the Track List Screen – First Screen
Modify the background properties of the application, and then update the style of the artist image cover to have a circular shape.
To change the background image in PowerApps:
In Power Apps Studio, select the screen.
In the right-hand pane, go to Properties.
From the background image dropdown options, select Add an image file.
Browse to the file you want to use, and then select Open.
To make the image control have a circular shape:
Select the image control that has the artist album cover.
In the right-hand pane, select the Properties tab, if not selected.
Change the Border radius properties to 360.
The aspect ratio of the image will be preserved. If the image isn’t square, it may appear distorted in the circular shape.
Step 2: Build the Play Screen – Second Screen
Now you’ll create the play screen, which includes various controls and displays whatever track is playing. After you’ve built the screen, establish a connection between the first screen (the track list screen) and the play screen. Once you’ve connected the screens, you can navigate, when a track is selected.
To do this:
Go to the first screen (track list screen).
Select the first track, and update its OnSelect property to Navigate(PlayScreen,ScreenTransition.CoverRight). This will navigate the screen to the play screen when you select it.
Build the second screen (play screen) by adding the image control from the selected track:
From the Insert tab, add an image control.
To set the image from the SharePoint data source, update the image control's Image property by using the provided sample code: SongGallery.Selected.Artwork. This is based on my naming; yours might be different.
The Selected property in PowerApps references a currently selected item in a gallery, form, or data table. The SongGallery.Selected refers to a currently selected item in the SongGallery gallery.
The Artwork in this formula is the name of the field that contains the image of the artwork in the data source used by the SongGallery gallery. This field is the column in a SharePoint list that has the track title.
Therefore, SongGallery.Selected.Artwork refers to the image of the artwork that’s associated with the currently selected item in the SongGallery gallery. It enables you to display the artwork in an image control or perform other actions with it.
After making all the necessary editing, the screenshot below shows how your image should look.
Add the Text Labels
On the Insert tab, add two text labels.
Just like the previous updates that you made on the image preview of the selected track, you will apply similar code here to view the artist name and the track title.
Set the Text property of one of the text labels to SongGallery.Selected.Title. The Title in this formula is the name of the field that contains the title of the item in the data source that the SongGallery gallery uses. This field is the column in the SharePoint list that has the track title.
Set the Text property of the second text label to SongGallery.Selected.Artist.
Add the Audio Control
This is the control that will play the selected track from the previous screen.
From the Insert tab under Media, add an audio Control.
Update its Media property with First(SongGallery.Selected.Attachments).AbsoluteUri. This will enable the audio player to play the selected track.
The following is similar to earlier code, except:
The First function is used to return the first attachment in the attachments collection associated with the currently selected item in the SongGallery gallery.
The AbsoluteUri is a property of the attachment object that returns the URL of the attachment. First(SongGallery.Selected.Attachments).AbsoluteUri is a reference to the URL of the first attachment associated with the currently selected item in the SongGallery gallery, which enables you to display or download the attachment or perform other actions with it.
Set the control's Visible property to false. This hides it, since new controls will be created for playback.
Set the StartTime property to the variable varStarPos.
Set the Start property to varPlay.
Add the Slide Bar, Forward, Backward, Play, and Pause Buttons
You will need to add a slide bar and four image controls. You can find these on the Insert tab. After you’ve added them, update each control as follows.
Update the image of the button by browsing an image from your system through the image property of the image control on the Properties tab in the right-hand pane.
Slide Bar:
You can use the slide bar to fast-forward or rewind the currently playing audio.
Update the OnSelect property to Set(varStarPos, Self.Value). Here you're setting a variable called varStarPos that is used in the StartTime property of the audio control.
Update the Max property to Audio1.Duration. Audio1 is the default name of the audio control.
Set the Default property to varStarPos. This is the default position of the slide bar.
Forward Button:
Update the OnSelect property of the button with the following:
Set(varStarPos, Audio1.Time + 2);
Set(varPlay, false);
Set(varPlay, true);
Backward Button:
Update the OnSelect property of the button with the following:
Set(varStarPos, Audio1.Time - 2);
Set(varPlay, false);
Set(varPlay, true);
Play Button:
Update with the following:
Update the OnSelect property to Set(varPlay, true).
Set its Visible property to !varPlay. The button should not be visible when the audio is playing.
Pause Button:
Update with the following:
Update the OnSelect property to Set(varPlay, false).
Set its Visible property to varPlay. The button should be visible when the audio is playing.
Place the play button on top of the pause button. The Visible properties make one button appear and the other disappear based on whether the audio is playing.
The resulting app interface for the play screen (second screen).
Step 3: Build the Add Track Screen – Third Screen
Because you created the app from a SharePoint list, the screen has already been generated for you. You can update the interface by using the available properties.
Step 4: View the Completed App Screens
And there you have it! Don't forget to add a sound-bar .gif file to the screen for display.
Summary
We learned how to build an audio player in Power Apps with a track list integrated from SharePoint. We started by creating a track list, picture library, and document library in SharePoint, and then used Power Apps to design and configure the application controls.
With this audio player, users can easily browse through the track list, select a track to play, and even pause or skip tracks. The user interface is sleek and intuitive, allowing for a seamless experience for users of all skill levels. The integration of SharePoint and Power Apps makes it possible to store and manage the audio files, in addition to any related pictures and documents, all in one central location. This not only improves organization and accessibility but also ensures that the latest versions of files are always available to users. The possibilities for customization and expansion of this audio player are endless, making it a valuable addition to anyone’s digital toolkit.
Resources
Microsoft Power Apps documentation
MobileNetV1 & MobileNetV3 MATLAB code for detecting wafer map defect patterns
Give MobileNetV1 & MobileNetV3 MATLAB code for identifying defective wafer map patterns. mobilenetv1 mobilenetv3 MATLAB Answers — New Questions
How can I show my data on the thingsview free app?
I used the read API key to upload my data, but it is showing an invalid API key. Kindly answer. 🙂 MATLAB Answers — New Questions
Obtain information after text pattern
Dear all,
I am dealing with files of extension .mat (not related to MATLAB) like the one attached. I would like to be able to save in a variable the numerical value that appears after the text pattern "material[x]:damping-constant = 1.0" (so, in this case, 1). As you can see, here x can take values from 1 to 8, so it would be useful for me to have an array where each index position stores the corresponding value.
Any ideas? information after text pattern MATLAB Answers — New Questions
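A hedged sketch of one approach (assuming the attachment is plain text; the file name below is an assumption): fileread plus a regular expression can capture both the material index x and the value after the equals sign, filling an 8-element array by index:
% Read the whole file as text (the .mat extension here is just a name).
txt = fileread('example.mat');            % assumed file name

% Capture the material index and the value after "damping-constant =".
tok = regexp(txt, ...
    'material\[(\d+)\]:damping-constant\s*=\s*([\d.eE+-]+)', 'tokens');

damping = nan(1, 8);                      % x runs from 1 to 8
for k = 1:numel(tok)
    idx = str2double(tok{k}{1});          % the x inside material[x]
    damping(idx) = str2double(tok{k}{2}); % the value after the equals sign
end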
Generated link to OneDrive file generates “404 NOT FOUND” page
Hi, I’m having an issue with OneDrive – if I attempt to share a file and generate a link that can be opened by anyone and then send that link out, anyone who tries to open the link receives a “404 NOT FOUND” page. This happens for any file in my OneDrive account. I can simulate this if I copy the link and then try to open the link in an incognito or private browser where I am not signed into my OneDrive account. The permissions on the file are set to “anyone with the link can edit”. Can anyone help? Read More
how can i load multiple/all audio files(.wav) in matlab ? all files have different names.
I have a folder containing a number of audio files. I want to load them in a loop so that each audio signal can undergo some operations that I intend to perform. audio signal, loading multiple files, matlab MATLAB Answers — New Questions
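A hedged sketch of the usual pattern (the folder path is an assumption): list the .wav files with dir, then loop over them with audioread:
folder = 'C:\audio';                       % assumed folder path
files = dir(fullfile(folder, '*.wav'));    % every .wav file, any name

signals = cell(1, numel(files));           % one cell per audio signal
fs = zeros(1, numel(files));

for k = 1:numel(files)
    fname = fullfile(folder, files(k).name);
    [signals{k}, fs(k)] = audioread(fname);
    % ... apply the intended operations to signals{k} here ...
end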
Live script: Error Loading
Hi,
I have created a MATLAB live script file and saved it as .mlx. When I now try to open the file I get the error:
Error Loading "path to file"
I have tried opening it in a text editor, but it is just empty, though the file is 219 KB.
What can I do? error, live script MATLAB Answers — New Questions
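A hedged recovery idea, not a guaranteed fix: a .mlx file is a zip-based package, so if the bytes are intact you may be able to pull the script text out by hand (file names below are assumptions):
% Copy the live script to a .zip and unpack it; an undamaged .mlx
% keeps the script text in matlab/document.xml inside the package.
copyfile('myscript.mlx', 'myscript_recovered.zip');   % assumed file name
unzip('myscript_recovered.zip', 'mlx_contents');
% Now open mlx_contents/matlab/document.xml in a text editor.
% If unzip errors out, the file itself is corrupted; look for backups
% or an autosave copy instead.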
Fully Utilize SharePoint as Document Management
Hi All,
I just learned about Document Sets and got some insights for document handling.
I have a plan for development:
Using document sets, we will manage all documents related to the end-to-end business, from quotations to invoices. Each Purchase Order document set will have its related documents (invoice, approval, payment remittance, etc.).
But how can we create a list that contains all the invoices, approvals, or payments (we can store or link them in a SharePoint list, and then when we click, we are redirected to the related document set)?
In short:
- We have document sets (containing invoice, PO, approval, remittance).
- We would like to create a view based on all POs (and other properties too), and when we click, be redirected to the related document set.
Any suggestions for articles or keywords? Read More
How to modify my MATLAB code to train a neural network with images from different directions?
I am working on a project to estimate the weight of pears from their RGB and depth images in MATLAB. Initially, I used only front-facing images for training, but now I want to improve the accuracy by including images taken from approximately 90 degrees to the left or right of the pear, in addition to the depth images.
I tried modifying the code to accommodate these new images, but I'm facing some issues.
I would like to change the following points:
- The pear images use the front view, an image taken approximately 90 degrees to the left or right of the front, and a depth image.
- I want to put all my data in a datastore.
The code below is what I have modified so far, but in this state an error occurs and prediction is impossible. Please lend me your strength.
cd 'RGB front file path'
folder_name = pwd;
XImageTrainFront = imageDatastore(folder_name);
cd 'RGB side file path'
folder_name = pwd;
XImageTrainSide = imageDatastore(folder_name);
cd 'Depth file path'
folder_name = pwd;
XDispTrain = imageDatastore(folder_name);
YTrain = [ data ]; % weight labels
YTrain = arrayDatastore(YTrain);
% Combine datasets
ds = combine(XImageTrainFront, XImageTrainSide, XDispTrain, YTrain);
dsrand = shuffle(ds);
dsVali = partition(dsrand, 10, 1);
dsTest = partition(dsrand, 10, 2);
ds1 = partition(dsrand, 10, 3);
ds2 = partition(dsrand, 10, 4);
ds3 = partition(dsrand, 10, 5);
ds4 = partition(dsrand, 10, 6);
ds5 = partition(dsrand, 10, 7);
ds6 = partition(dsrand, 10, 8);
ds7 = partition(dsrand, 10, 9);
ds8 = partition(dsrand, 10, 10);
dsTrain = combine(ds1, ds2, ds3, ds4, ds5, ds6, ds7, ds8, ReadOrder="sequential");
YTest = readall(dsTest);
YTest = YTest(:,4);
YTest = cell2mat(YTest);
cd 'path to googlenet300400_multiple.mat'
load googlenet300400_multiple.mat
miniBatchSize = 16;
options = trainingOptions('sgdm', ...
    'MiniBatchSize', miniBatchSize, ...
    'MaxEpochs', 300, ...
    'InitialLearnRate', 1e-7, ...
    'LearnRateSchedule', 'piecewise', ...
    'LearnRateDropFactor', 0.1, ...
    'LearnRateDropPeriod', 30, ...
    'Shuffle', 'every-epoch', ...
    'ValidationData', dsVali, ...
    'ValidationFrequency', 50, ...
    'Plots', 'training-progress', ...
    'Verbose', true);
net = trainNetwork(dsTrain, lgraph_2, options);
YPredicted = predict(net, dsTest);
predictionError = YTest - YPredicted;
squares = predictionError.^2;
rmse = sqrt(mean(squares));
figure
scatter(YPredicted, YTest, '+');
xlabel("Predicted Value");
ylabel("True Value");
hold on;
plot([100 550], [100 550], 'r--');
Error message that occurred during modification. Please respond to the following error:
Error using trainNetwork (line 191)
Invalid training data. The output size of the last layer ([1 1 1]) does not match the response size ([1 1 360000]).
Error in newcord2 (line 93)
net = trainNetwork(dsTrain, lgraph_2, options);
Error using nnet.cnn.LayerGraph/replaceLayer (line 300)
Layer 'finalLayerName' does not exist.
Error in newcord2 (line 85)
lgraph_2 = replaceLayer(lgraph_2, 'finalLayerName', finalLayer);
prediction, multi-input network MATLAB Answers — New Questions
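Two hedged editorial pointers, not a verified fix: 'finalLayerName' is only a placeholder, so replaceLayer fails until it is given the real name of a layer in lgraph_2; and the [1 1 360000] response size suggests an image datastore, rather than the arrayDatastore of weights, ended up in the response slot of combine. Inspecting and replacing the head might look like this (the layer name 'loss3-classifier' is GoogLeNet's default and is an assumption about this network):
load googlenet300400_multiple.mat          % assumed to contain lgraph_2
disp({lgraph_2.Layers.Name}')              % list the real layer names first

% Replace the final fully connected layer with a single-output layer so
% the network regresses one scalar weight per observation.
fc = fullyConnectedLayer(1, 'Name', 'fc_weight');
lgraph_2 = replaceLayer(lgraph_2, 'loss3-classifier', fc);  % substitute the real name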
Trouble with for loop to determine the discrete wavelet transform (DWT)
Hi, I need assistance plotting the approximation (A) and detail (D) coefficients separately for all sheets in an Excel file over a given time frame. Currently, the existing code generates the transform only for the final sheet (Sheet 8). I am uncertain how to modify the code to plot A1 and D1 separately for all sheets. I would be grateful for any guidance you may provide. Thank you for your time and help.
clc
close all
clear all
filename = 's.xlsx';
% Get the sheet names
[~, sheetNames] = xlsfinfo(filename);
for i = 1:numel(sheetNames)
    % Read the data from the sheet
    data = xlsread(filename, i);
    % Extract the time and signal columns
    t(:,i) = data(:, 1);
    sig(:,i) = data(:, 2);
    N = length(sig);
    dt = t(3) - t(2); % sampling time
    fs = 1/dt; % freq
    signal = sig(:,i);
    wname = 'bior6.8';
    % Discrete Wavelet, DWT
    [CA1,CD1] = dwt(signal,wname);
    A1 = idwt(CA1,[],wname,N);
    D1 = idwt([],CD1,wname,N);
    t(:,i) = linspace(0,N,N)*(1/fs);
    subplot(1,2,1);
    plot(t,A1,'k','LineWidth',1.5);
    title(sprintf('Approximation for sheet %d', i));
    set(gca,'fontname','Times New Roman','FontSize',10)
    xlabel('Time (secs)')
    ylabel('Amplitude')
    grid on
    subplot(1,2,2);
    plot(t,D1,'k','LineWidth',1.5);
    title(sprintf('Detail for sheet %d', i));
    set(gca,'fontname','Times New Roman','FontSize',10)
    xlabel('Time (secs)')
    ylabel('Amplitude')
    grid on
end
dwt, plot, for loop MATLAB Answers — New Questions
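A hedged sketch of the usual fix (not tested against the attached workbook): the loop reuses a single figure, so each iteration's subplot overwrites the previous sheet's plots and only the last sheet survives. Opening a new figure per sheet, and plotting against that sheet's own time vector, keeps all of them:
for i = 1:numel(sheetNames)
    data = xlsread(filename, i);
    ti = data(:, 1);                       % this sheet's time column
    sig = data(:, 2);
    N = length(sig);
    [CA1, CD1] = dwt(sig, 'bior6.8');
    A1 = idwt(CA1, [], 'bior6.8', N);
    D1 = idwt([], CD1, 'bior6.8', N);

    figure(i)                              % a separate figure per sheet
    subplot(1,2,1); plot(ti, A1, 'k', 'LineWidth', 1.5)
    title(sprintf('Approximation for sheet %d', i))
    xlabel('Time (secs)'); ylabel('Amplitude'); grid on
    subplot(1,2,2); plot(ti, D1, 'k', 'LineWidth', 1.5)
    title(sprintf('Detail for sheet %d', i))
    xlabel('Time (secs)'); ylabel('Amplitude'); grid on
end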