Category: Microsoft
Azure Join Failing at Device Setup
I built two VMs, Windows 10 and Windows 11. Both install fine from Intune, but the laptop fails at Device Setup. The IME log shows:
[Location Service] Failed to Get Endpoint From LocationServiceServiceAddressesController with url https://manage.microsoft.com/RestUserAuthLocationService/RestUserAuthLocationService/Certificate/ServiceAddresses, thumbprint BCAD588F2610428A48540BC92C94F50DE008C6AF,True, WebException status NameResolutionFailure message The remote name could not be resolved: 'manage.microsoft.com' full System.Net.WebException: The remote name could not be resolved: 'manage.microsoft.com'
at System.Net.HttpWebRequest.GetResponse()
at Microsoft.Management.Services.IntuneWindowsAgent.AgentCommon.DiscoveryService.GetEndpointFromLocationServiceServiceAddressesController(X509Certificate2 deviceCertificate, Guid deviceId)
All configurations and profiles are the same as the VMs'.
Any suggestions?
Error when upgrading Windows Server 2012 to 2016
I'm trying to upgrade a Dell server from 2012 to 2016. It's a basic installation; there's only an accounting program running, no AD or anything.
The first time I ran it, I received an error message that an old accounting program, no longer used, was not compatible with the new version. I removed it, then received a message that LogMeIn was not compatible, so I removed that too. Now a message appears reporting an incompatible program, but it doesn't say what it is.
Any tips on how to resolve this?
Is Microsoft Planning an API Suite Including Every Service They Have? If Not, Why Shouldn't They?
Is Microsoft planning an API suite including every service they have? If not, why shouldn't they?
Something like RapidAPI, but where every possible service or product function can be called via API?
Wouldn't that be sweet?
WS 2025 Public Preview
I am currently enjoying the public preview of Windows Server 2025, in particular the GUI version. What I find of particular interest is the support for the latest hardware. The main thing that separates Microsoft from all the other OSes is software support for a variety of hardware configurations. If that did not exist, we would have a monopoly (Apple hardware).
Microsoft Table
I have two questions on Word table:
1. Is there a way to leave cells blank when a row repeats the values above it, so that only the first occurrence is shown? Can I use Word's existing functions to set this up? If not, would this be a VBA macro question? The Word document is a template and will be populated with data when generating a report.
Company | Depart     | Name
ABC     | Accounting | Sherman
ABC     | Accounting | David
ABC     | Sales      | John
ABC     | Sales      | Mary

Company | Depart     | Name
ABC     | Accounting | Sherman
        |            | David
ABC     | Sales      | John
        |            | Mary
2. How do I fix the column widths in a table, so that when data is populated the columns will not shrink or expand?
Thanks
Week of June 18, 2024: Azure Updates
Generally Available: Azure Monitor, Log Analytics dedicated clusters now supported in Azure portal
Status: Now Available
You can now create and manage dedicated clusters in the Azure portal, including creating and deleting clusters, linking and unlinking workspaces, changing the commitment tier, and viewing cluster configuration.
Until now, clusters could be provisioned and managed programmatically using the CLI, PowerShell, and REST. We've received your requests to support this in the Azure portal for simpler configuration and viewing, making it easy to answer questions like: how many workspaces are linked, or what is the commitment tier?
Additional experiences such as Customer-managed key configuration are planned next.
Dedicated cluster Overview
Reference
Related Products
Azure Monitor
Log Analytics
_____________________________________________________________________________________________________________________________________________
Public Preview: Summary rules in Azure Monitor Log Analytics, for optimal consumption experiences and cost
Status: In Preview
Summary rules allow you to aggregate ingested data in a workspace for a given query and cadence, and ingest the results back into a custom log table in the workspace for optimal consumption experiences and cost.
Summary rules operate as batch processing directly in your Log Analytics workspace. They summarize incoming data to your workspace in small chunks, defined by bin size, and ingest the results into an Analytics custom log table in your workspace. While running complex queries on large data sets may time out, and is limited on the Basic tier, it's much easier to analyze and report on summarized data that has been "cleaned" and aggregated down to the reduced set of data you need.
Example scenarios:
Perform analysis and reporting on large data sets and time ranges, for security and incident analysis, and for month-over-month and annual business reports.
Optimize cost by ingesting low-fidelity or verbose logs into tables in a lower tier (e.g. Basic), and summarizing into an Analytics table that can be used for reports, dashboards, or analysis, and retained for a long time at lower cost.
Segregate table-level access for privacy and security, by obfuscating private details in summarized data that can be shared.
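The batch aggregation a summary rule performs can be pictured with a small, purely illustrative sketch. This is not the Azure API: the records, the 30-minute bin size, and the count aggregation below are all made up for the example; a real rule runs a KQL query on a cadence inside the workspace.

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical raw log records; a summary rule would read these from a
# source table in the workspace.
records = [
    {"time": datetime(2024, 6, 18, 10, 3),  "status": 500},
    {"time": datetime(2024, 6, 18, 10, 17), "status": 200},
    {"time": datetime(2024, 6, 18, 10, 42), "status": 500},
]

def bin_start(ts):
    # Floor a timestamp to the start of its 30-minute bin.
    return ts.replace(minute=(ts.minute // 30) * 30, second=0, microsecond=0)

# Aggregate: count of records per (bin, status) -- the reduced data set that
# would be ingested into the custom summary table.
summary = defaultdict(int)
for r in records:
    summary[(bin_start(r["time"]), r["status"])] += 1

for (start, status), count in sorted(summary.items()):
    print(start.isoformat(), status, count)
```

Reports then query the small summary table instead of re-scanning the raw rows.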
Summary rules diagram
Rule configuration
Initial configuration is provided via REST in public clouds, followed by Bicep and Terraform. CLI, PowerShell, and Azure portal support is planned around general availability.
Reference
Related Products
Azure Monitor
Log Analytics
_____________________________________________________________________________________________________________________________________________
Generally Available: Run Azure Load Testing on Azure Functions
Status: Now Available
You can now create and run load tests directly from Azure Functions in the Azure portal. Load test your functions by simply selecting the function and key and specifying the request parameters and load configuration. You will automatically gain access to client-side and Functions metrics, which will help in identifying performance bottlenecks. You can also view the test run history to continuously monitor your Function App performance. Learn more.
_____________________________________________________________________________________________________________________________________________
Generally Available: Run load tests in debug mode on Azure Load Testing
Status: Now Available
Azure Load Testing now supports running low scale test runs in Debug mode enabling better debuggability with enhanced logging. It provides debug logs for the test script, and request and response data for every failed request during the test run. Debuggability of test scripts during load testing is crucial for identifying and resolving issues early in the testing process. It allows you to validate the test configuration, understand the application behavior under load, and troubleshoot any issues that arise. Learn more.
Related Products
Azure Load Testing
_____________________________________________________________________________________________________________________________________________
Public Preview: Azure SQL updates for mid-June 2024
Status: In Preview
In mid-June 2024, the following updates and enhancements were made to Azure SQL:
Work with Unicode and text data efficiently using UNISTR and || operators in Azure SQL Database.
Related Products
Azure SQL Database
_____________________________________________________________________________________________________________________________________________
Public Preview: Azure Cosmos DB continuous backup for accounts using Azure Synapse Link
Status: In Preview
Continuous backup is now in public preview for Azure Cosmos DB accounts using Azure Synapse Link. Migrate to continuous backup to optimize costs and unlock point-in-time restores. This also enables you to use Fabric mirroring, for your advanced analytics on Microsoft Fabric. Continuous backup is a pre-requisite for Fabric mirroring. Learn more.
Related Products
Azure Cosmos DB
_____________________________________________________________________________________________________________________________________________
General Availability: vCore-based Azure Cosmos DB for MongoDB now supports Mongo Ver 7.0
Status: Now Available
We are thrilled to announce that vCore-based Azure Cosmos DB for MongoDB has expanded its capabilities and now supports MongoDB Version 7.0. This is a significant step forward in our commitment to providing the best possible service. Learn more.
Related Products
Azure Cosmos DB
_____________________________________________________________________________________________________________________________________________
End of Support: Azure support plan offer being discontinued on June 30, 2024
Status: End of Support
The existing Azure Support offer is being discontinued on June 30, 2024. Beginning July 1, 2024, all customers who do not already have a paid support plan (Microsoft Unified, ProDirect support, etc.) will need to purchase a support plan if they wish to maintain technical support coverage.
Customers will retain access to the subscription management and billing support services available to all Azure customers at no charge.
Customers wishing to transact support on their Enterprise Agreement should contact their Microsoft representative or partner.
Customers wishing to transact support on their Microsoft Customer Agreement, can purchase support online in the Azure Portal.
The webpage will be updated on July 1, 2024 to note that the promotional offer has come to an end.
_____________________________________________________________________________________________________________________________________________
Public Preview: Announcing Foundation Model Training
Status: In Preview
With Foundation Model Training, Azure Databricks users can use their own data to customize a foundation model to optimize performance for their specific application.
By fine-tuning or continuing training of a foundation model, organizations can train their own model using significantly less data, time, and compute resources versus training a model from scratch.
Azure Databricks users have everything in a single platform: their own data to use for training, the foundation model to train, checkpoints saved to MLflow, and the model registered in Unity Catalog ready to deploy.
_____________________________________________________________________________________________________________________________________________
Generally Available: Azure Virtual Network Manager mesh and direct connectivity
Status: Now Available
Azure Virtual Network Manager's mesh connectivity configuration, and direct connectivity in the hub-and-spoke connectivity configuration, are generally available in all public regions. Visit the connectivity configuration documentation to learn more about Azure Virtual Network Manager's connectivity configurations.
This feature allows a group of virtual networks to communicate directly with each other without an additional hop, improving latency and reducing the management overhead of connectivity for each virtual network. For example, you can use this feature to let a subset of spoke virtual networks that require low latency in a hub-and-spoke topology communicate directly with each other. The traffic between these virtual networks can be filtered using network security groups and Azure Virtual Network Manager's security admin rules while maintaining direct connectivity. To learn more about security admin rules and their use cases, please see the documentation on security admin rule concepts.
Additionally, the traffic can be monitored using VNet flow logs. For further information on VNet flow logs, refer to the virtual network flow logs documentation.
Related Products:
_____________________________________________________________________________________________________________________________________________
Generally Available: Azure Monitor OpenTelemetry-based Distro adds Live Metrics
Status: Now Available
Azure Monitor Application Insights is a cloud-native application monitoring offering that enables customers to observe failures, bottlenecks, and usage patterns to resolve incidents faster and reduce downtime.
Live Metrics is a production-grade capability within Azure Monitor application insights that enables you to see telemetry flowing off your application with one-second latency. For example, you may use Live Metrics as you deploy updates to production to see real-time key metrics (e.g. failure rate change).
The Azure Monitor OpenTelemetry “Distro” includes a thin wrapper that enables you to get started with a single line of code, and it includes added Azure-specific capabilities such as Live Metrics to give you a first-class experience on Azure.
Today’s announcement adds Live Metrics to .NET, Node.js, and Python. The Azure Monitor OpenTelemetry Java Distro already had Live Metrics. Our immediate next step is to add filtering to Live Metrics, so that you can customize your near-real time view.
Learn More:
https://learn.microsoft.com/azure/azure-monitor/app/opentelemetry-enable
https://learn.microsoft.com/azure/azure-monitor/app/live-stream
Related Products:
Azure Monitor
Application Insights
_____________________________________________________________________________________________________________________________________________
Generally Available: Azure Monitor Log Enablement Policy Expansion
Status: Now Available
Azure Monitor enables customers to gain end-to-end observability into their applications, infrastructure, and network by collecting, analyzing and acting on telemetry data from their cloud and hybrid environments. Diagnostic settings is a common mechanism by which customers can enable collection of platform logs that Azure makes available on the performance of their Azure resources.
The Azure Monitor team has recently released into general availability (GA) new built-in policies and initiatives for enabling diagnostic settings at scale for all log categories, and updated initiatives for auditing customer interactions with service settings and service data via Azure Policy. https://aka.ms/azmonauditpolicy
Related Products:
Azure Monitor
Excel ran out of resources while attempting to calculate one or more formulas
Excel ran out of resources while attempting to calculate one or more formulas. Can someone please help, as I have tried all online resources and nothing has worked.
SMTP Relay replacement – suggestions?
My org is a large public agency and continues to have significant use of SMTP relay due to a range of legacy, still-critical apps and processes in our on-prem environment; these services send email 'streams' on a daily basis via SMTP relay (e.g. information/notifications to thousands of employees, emergency communications, a wide range of DLs for various purposes, public notifications, etc.). We're in a hybrid Exchange/AD environment and expect to remain so for the foreseeable future.
Reality: it will be years before these “on-prem SMTP-relay-requiring services” can feasibly be moved to something more current and/or cloud-based.
Can anyone suggest a vendor/solution for SMTP relay from an on-premises environment that they’re happy with?
Microsoft Purview Data Loss Prevention Policy started creating alerts after 8 months of creation
Regarding Microsoft Purview Data Loss Prevention:
I created a DLP policy in October 2023 covering Exchange, SharePoint, OneDrive, and Teams that uses multiple trainable classifiers (broad scope) within the DLP rules to monitor data within my environment. The DLP policy has been turned "On" since October 2023, and I have other DLP policies for PII which are turned "On" and are generating alerts. The October 2023 DLP policy JUST started generating alerts as it should in June 2024. Is this an issue on Microsoft's end? There is no way that the policy rules just started matching against the locations described in the DLP policy, given how many trainable classifiers I have in the policy rules and the data within our environment. Please advise. Thank you!
Microsoft Word stopped displaying pictures
I have a huge problem with one of my documents. It has suddenly stopped displaying pictures.
Not only the pictures, but also the lines and boxes I created using the drawing option.
The problem did not quite happen suddenly. I first had problems inserting regular captions: each picture I inserted was automatically numbered Figure 1, which I found super strange. Then I needed to restart my computer, and found after the restart that 50% of the pictures cannot be displayed!
Can anyone help?! (This problem is the same in Word and Word Online)
How to build a Copilot for Security API Plugin – Part 2
Copilot for Security (Copilot) is a large language model (LLM)-based generative artificial intelligence (GAI) system for cybersecurity, compliance, identity, and management use cases. Copilot is not a monolithic system but an ecosystem running on a platform that allows data requests from multiple sources using a unique plugin mechanism. Plugins allow Copilot to reason not only over data from Microsoft products but also from third parties.
In part-I of this series, we discussed building an API plugin using a single GET call. In this article, we expand on part-I and look at building API plugins that make more advanced GET calls using parameters. If you have not read part-I, we encourage you to do so first, as several parts of this article assume familiarity with the code and other details mentioned there. In this blog, we discuss only API plugins; more information on the other types of Copilot plugins can be found here.
GET calls with Query Parameters
Let us add another function to the Flask website we had first created in part-I. While we can use any standard application that exposes a REST API, it is easier and clearer from the server-side if we have full control of the REST service. The new function will be used for a GET call that will take in three parameters, two of them in the query and one in the path. A new Class is required to handle this additional data, code for which is given below:
# Imports as in part-I: from flask import Flask, request, jsonify

# Use this class to reflect back the parameters passed via GET query
class ReflectorJson:
    def __init__(self, data, json, ip, useragent):
        self.object = "Reflector Json"
        self.userdata = data
        self.value1 = json["value1"]
        self.value2 = json["value2"]
        self.sourceip = ip
        self.useragent = useragent

    def getDict(self):
        return self.__dict__

# This method accepts query parameters and passes them to create a ReflectorJson JSON object
@app.route('/params/<data>', methods=['GET'])
def get_params_data(data):
    args = request.args
    jsonData = args
    obj = ReflectorJson(data, jsonData, request.remote_addr, request.user_agent.string)
    response = jsonify(obj.getDict())
    return response
ReflectorJson assigns the passed values to internal properties and returns a dictionary of those properties from the ReflectorJson.getDict() function. The dictionary is converted to JSON by the jsonify() function and returned as an HTTP response. Hence the get_params_data() function returns the JSON serialization of the ReflectorJson object, with the serialization including the values passed to it. To better understand the output, let us run this site manually on a local machine. We have bound the webservice to all network interfaces and will run it on port 5000. When we see the following log output in the Python console, the webserver is up and ready to service requests.
Since we are passing in multiple parameters, it will be easier if we use a REST client like Boomerang or Postman. Because Boomerang has an easy-to-use interface available as a plugin for Microsoft Edge, we will use it here.
In Boomerang, we add the two values (they should be named 'value1' and 'value2', as the Flask app extracts their values based on these names) and call the path http://127.0.0.1:5000/params/testData, where 'testData' is the value that will be assigned to the 'data' variable inside the 'get_params_data' function.
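If a GUI REST client is not handy, the same round trip can also be exercised in-process with Flask's test client. The sketch below is deliberately self-contained (it repeats the ReflectorJson code from above), so no running server or network access is needed:

```python
from flask import Flask, request, jsonify

app = Flask(__name__)

class ReflectorJson:
    def __init__(self, data, json, ip, useragent):
        self.object = "Reflector Json"
        self.userdata = data
        self.value1 = json["value1"]
        self.value2 = json["value2"]
        self.sourceip = ip
        self.useragent = useragent

    def getDict(self):
        return self.__dict__

@app.route('/params/<data>', methods=['GET'])
def get_params_data(data):
    obj = ReflectorJson(data, request.args, request.remote_addr,
                        request.user_agent.string)
    return jsonify(obj.getDict())

# Drive the endpoint in-process; equivalent to the Boomerang request above.
client = app.test_client()
resp = client.get("/params/testData",
                  query_string={"value1": "This is Value 1",
                                "value2": "This is Value 2"})
body = resp.get_json()
print(body)
```

This is just a local sanity check; the Boomerang request above exercises the real HTTP path.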
When we send this request, the Flask website returns the response in a JSON which is shown in Boomerang’s Response tab:
The variables "userdata", "value1" and "value2" are the ones explicitly passed by our GET call and reflected back by the Flask webservice. With our REST endpoint now working, we are ready to make a plugin that will make a GET call and pass in the three variables. Note that, as in part-I, if you intend to use the Flask webservice to test this plugin, you must host it where it is accessible from the Internet so that Copilot can communicate with it. This can be done by hosting the Flask webservice as an Azure App Service, on an Azure VM, or by some other means. We can also use other webservices that service GET calls, but make sure to change the plugin YAML files given in the next section accordingly.
Plugin YAML files
The main plugin file in YAML format is given below:
#Filename: API_Plugin_Reflection_GET_Params.yaml
Descriptor:
  Name: Elman's Reflection API plug-in using GET params v1
  DisplayName: Elman's Reflection API plug-in using GET params v1
  Description: Skills for getting a GET REST API call reflection based on parameters that are passed v1
  DescriptionForModel: Skills for getting a GET REST API call reflection based on parameters that are passed. This can be called with a prompt like "Get Elman's Reflection Data for data1 with value1 and value2"
SkillGroups:
  - Format: API
    Settings:
      # Replace this with your own URL where the OpenAPI spec file is located.
      OpenApiSpecUrl: http://<URL>/file/API_Plugin_Reflection_OAI_GET_Params.yaml
We give the plugin a unique name starting with ‘Elman’ in honor of Jeff Elman who designed the first Recurrent Neural Network. The above description will use the OpenAPI specification defined in the file ‘API_Plugin_Reflection_OAI_GET_Params.yaml’.
To generate the OpenAPI specification, we will use the same approach as we did in part-I, which is to use Bing Copilot. However, this time we will give a more detailed prompt which contains the output JSON so Bing Copilot has all the nuanced details to generate the file. The prompt to give Bing Copilot to generate the OpenAPI specification is given below:
Write an OpenAPI spec document that takes a GET call to http://127.0.0.1:5000/params/{data} where {data} is a variable, along with two query parameters value1 and value2, and returns the following JSON output. The JSON schema should be defined in a separate schema section in path /components/schemas/ReflectionDataParamsPluginResponse that is referenced with $ref. JSON schema should only contains the type and description properties for each value:
{
  "object": "Reflector Json",
  "sourceip": "127.0.0.1",
  "useragent": "",
  "userdata": "testData",
  "value1": "This is Value 1",
  "value2": "This is Value 2"
}
Partial output for the above prompt in Bing Copilot is shown below:
After copying the generated OpenAPI file and making slight modifications (mainly to the title and description fields), the final OpenAPI specification is given below. We upload this OpenAPI specification to the location specified in the 'OpenApiSpecUrl' field of the main YAML document. Remember that this location must be publicly accessible over the Internet.
The OpenAPI spec file is given below:
openapi: 3.0.0
info:
  title: REST API Reflection using GET params
  description: Skills for getting reflection input for a GET REST API call using Params
  version: "v1"
servers:
  # Replace this with your own URL where the Flask webservice is hosted.
  - url: http://172.13.112.25:5000
paths:
  /params/{input}:
    get:
      operationId: ReflectionDataGETParams
      summary: A Reflection Data Plugin that reads values from URL Params and returns them
      parameters:
        - in: path
          name: input
          schema:
            type: string
          required: true
          description: Parameter Input
        - in: query
          name: value1
          schema:
            type: string
          required: true
          description: Value Parameter 1
        - in: query
          name: value2
          schema:
            type: string
          required: true
          description: Value Parameter 2
      responses:
        "200":
          description: OK
          content:
            application/json:
              schema:
                $ref: "#/components/schemas/ReflectionDataParamsPluginResponse"
# This is referred to by $ref
components:
  schemas:
    ReflectionDataParamsPluginResponse:
      type: object
      properties:
        object:
          type: string
          description: Object type
        userdata:
          type: string
          description: Userdata
        value1:
          type: string
          description: Reflected Parameter 1
        value2:
          type: string
          description: Reflected Parameter 2
        sourceip:
          type: string
          description: The Source IP
        useragent:
          type: string
          description: The User Agent
With the OpenAPI specification file ready let us now upload the plugin. Click on the sources icon as highlighted in red circle below:
In Custom section, select ‘Upload Plugin’:
Select the 'Copilot for Security Plugin':
After selecting the main YAML file for the plugin, press the ‘Add’ button to complete the upload:
Note: If you would like your API custom plugin to be used by others within your tenant, please change the “Who can use this plugin?” from ‘Just Me’ to ‘Everyone’. For more information, see Copilot for Security Authentication.
If the plugin upload is successful, a 'Plugin added' confirmation will be shown. If the plugin fails to upload, an error message, possibly accompanied by a code, will be displayed. Incorrectly formatted YAML files are one of the most common causes of errors; if you have an error code, more information is available here:
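Because malformed YAML is such a common failure mode, it can save an upload round trip to confirm the file at least parses before submitting it. The sketch below uses PyYAML; the inline string is a trimmed, hypothetical stand-in for the real plugin manifest (in practice you would load the actual file from disk instead):

```python
import yaml  # PyYAML

# Trimmed, hypothetical stand-in for the plugin manifest; in practice load
# the real file, e.g. yaml.safe_load(open("API_Plugin_Reflection_GET_Params.yaml")).
SPEC = """\
Descriptor:
  Name: Elman's Reflection API plug-in using GET params v1
SkillGroups:
  - Format: API
    Settings:
      OpenApiSpecUrl: http://example.invalid/spec.yaml
"""

doc = yaml.safe_load(SPEC)  # raises yaml.YAMLError on bad formatting
print(sorted(doc))
print(doc["SkillGroups"][0]["Format"])
```

This only checks that the YAML is well formed, not that Copilot will accept the fields, but it catches the indentation and quoting mistakes that most often cause upload errors.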
Since we have used the Flask API to also serve the OpenAPI specification file, we can see in the server logs the call made by Copilot to download it from the URL given in the 'OpenApiSpecUrl' field.
With the plugin uploaded, it is now time to validate it. When a new plugin is added, it is more efficient to invoke its skill directly and pass the parameters manually, rather than giving a prompt and having Copilot parse it (prompt engineering comes into play to make sure the correct parameters are extracted from your prompt!).
To invoke a skill directly click on the ‘Prompts’ icon as shown below:
A popup appears showing all available promptbooks and skills; select 'See all system capabilities' to view all the skills:
For API plugins, the values specified in the 'operationId' and the 'summary' or 'description' fields assigned to each skill (each skill corresponds to a unique REST API endpoint) are displayed in system capabilities. We can search by the 'operationId', which in our case is 'ReflectionDataGETParams', as seen in the OpenAPI specification. Searching by the first few keywords brings it up; we can then click on the name.
This brings up a new window where you can directly enter the values of the parameters to pass to the skill (these values will then be passed to the REST API):
After entering the values for the parameters, click the ‘Submit’ button:
Copilot will invoke the skill directly and make a REST call with the parameters to our server, which we can verify on the server logs:
The REST call will return a JSON similar to the one we get when making the call directly from Boomerang. Copilot formats the JSON in a nicely formed paragraph:
Now we invoke the skill via a prompt that contains all the fields required by the API call. The prompt is:
Get reflection data for newTestInput, newParamValue1 and newParamValue2
Copilot passes the correct parameter values to the API which we verify in the server:
The output JSON is also nicely formatted in bulleted form.
One observation from the previous prompt is that Copilot assigns the parameter values in sequential order. In the prompt we can also specify which input field each value corresponds to, which makes for a better prompt by removing ambiguity in value assignment (hint: this is prompt engineering!).
In the following prompt, we reverse the order of the values but explicitly specify which input parameter each value corresponds to.
Get reflection data where value2 is newParamValue2, value1 is newParamValue1 and input is TestInput
From the prompt output we can see that even though TestInput was passed last, it was correctly assigned to the ‘user data’ output variable. We can also verify the order of the parameters by looking at the GET call on the server:
So far, we have been passing all the required inputs in our prompts. What happens if our prompt does not include all the parameters? Let us run the following prompt in a new session and find out:
Get reflection data for TestInput
The above prompt is missing the values for ‘value1’ and ‘value2’. Copilot correctly passes the TestInput value, but the values for ‘Value 1’ and ‘Value 2’ are arbitrary and obviously not correct. The server log shows the raw GET call.
Since we did not specify all three parameters required by the ‘ReflectionDataGETParams’ skill, Copilot uses other values from the prompt or from the current session to fill them in. In some cases the skill may not even be selected, since required inputs are missing.
Note that in this case we ran the prompt in a new session. If we run it in an existing session, some of the previous outputs may be inserted for value1 and value2, leading to either a correct or a completely incorrect result depending on what values were picked up. This is why prompt engineering is important: the prompt must be framed so that the required inputs for a skill are present in the prompt or the session.
One way to mitigate arbitrary values being passed for missing inputs is to assign a default value to each input.
Using default values for parameters
Copilot for Security allows assigning a default value to an input, and one way to do that is to specify the default in natural language in the input’s ‘description’ field. To set a default for ‘value1’ we change its description to ‘Value Parameter 1, default is “Dummy Value 1”’ (the original description was ‘Value Parameter 1’). This sets the string “Dummy Value 1” as the default for ‘value1’; similarly, the ‘description’ field for ‘value2’ becomes ‘Value Parameter 2, default is “Dummy Value 2”’. These are the only changes required, and the updated OpenAPI specification file is given below:
openapi: 3.0.0
info:
  title: REST API Reflection using GET params
  description: Skills for getting reflection input for a GET REST API call using Params
  version: "v1"
servers:
  # Replace this with your own URL where the OpenAPI spec file is located.
  - url: http://172.13.112.25:5000
paths:
  /params/{input}:
    get:
      operationId: ReflectionDataGETParams
      summary: A Reflection Data Plugin that reads values from URL Params and returns them
      parameters:
        - in: path
          name: input
          schema:
            type: string
          required: true
          description: Parameter Input
        - in: query
          name: value1
          schema:
            type: string
          required: true
          description: Value Parameter 1, default is "Dummy Value 1"
        - in: query
          name: value2
          schema:
            type: string
          required: true
          description: Value Parameter 2, default is "Dummy Value 2"
      responses:
        "200":
          description: OK
          content:
            application/json:
              schema:
                $ref: "#/components/schemas/ReflectionDataParamsPluginResponse"
# This is referred to by $ref
components:
  schemas:
    ReflectionDataParamsPluginResponse:
      type: object
      properties:
        objecttype:
          type: string
          description: Object type
        userdata:
          type: string
          description: Userdata
        value1:
          type: string
          description: Reflected Parameter 1
        value2:
          type: string
          description: Reflected Parameter 2
        sourceip:
          type: string
          description: The Source IP
        useragent:
          type: string
          description: The User Agent
Delete the current plugin and reimport it, so the new OpenAPI specification document is used.
In a new session, let us give the same prompt as last time, where only one of the three required inputs is specified:
Get reflection data for TestInput
The only input present in the prompt is assigned to ‘User Data’, while Value 1 and Value 2 are assigned their respective default values. The server log shows the REST call made with the default values.
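The spec-level defaults are interpreted by Copilot. As a complementary, purely illustrative safeguard (not something the article implements), the server itself can apply the same defaults when a query parameter is absent, so the API behaves sensibly for any client:

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# Same default strings as in the updated OpenAPI spec descriptions.
DEFAULTS = {"value1": "Dummy Value 1", "value2": "Dummy Value 2"}

@app.route("/params/<input>", methods=["GET"])
def reflection_data_get_params(input):
    # Fall back to the defaults when a query parameter is missing, so the
    # endpoint degrades gracefully for any caller, not just Copilot.
    return jsonify(
        userdata=input,
        value1=request.args.get("value1", DEFAULTS["value1"]),
        value2=request.args.get("value2", DEFAULTS["value2"]),
    )
```

With this in place, a bare GET /params/TestInput returns the default strings for value1 and value2 instead of failing or echoing empty values.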
In this article, we showed how to make GET calls with parameters. So far, we have not discussed making REST API calls with authentication; that will be the topic of discussion in part III, stay tuned.
Microsoft Tech Community – Latest Blogs –Read More
Logic Apps Standard – New Hybrid Deployment Model (Preview)
At the Integrate 2024 event, we announced a new Hybrid Deployment Model for Azure Logic Apps (Standard) that allows you to run Logic Apps workloads on customer managed infrastructure. This new capability is currently in an early access preview and interested parties should fill out the following nomination form: https://aka.ms/HybridLAOnboarding.
The new hybrid deployment model is ideal for customers who want more control over where and how their integration workloads are hosted. This includes on-premises, private clouds, or public clouds. This offering focuses on semi-connected scenarios that offer local processing, local storage, and local network access. Using this model allows customers to absorb intermittent internet connectivity issues.
Regardless of where your Logic Apps are deployed, you can still leverage the Azure portal, via the Azure Arc agent, to access the control plane. This provides a unified experience independent of where your workflows are deployed.
For additional information please watch the following video:
What’s Star-Tap
I’m looking at Experience the New OneDrive: Fast, Organized, and Personalized – Microsoft Community Hub which references the star-tap experience. I’ve searched & can’t find anything about it.
“Favorites and File Shortcuts We’re adding two new ways to manage and find important files. Now, you can easily favorite files in OneDrive using the familiar ‘star-tap’ experience found in Microsoft 365 apps. “
Thanks in advance.
FY25 Business Applications Partner Activities webinar series starts June 24!
Be sure to sign up now for the FY25 Business Applications Partner Activities webinar series, coming up June 24-27. The first event is only a week away!
Learn about the priorities and strategy for FY25 – and how you can integrate incentives into your business strategy to grow your business and deliver excellent customer value.
Learn more and register today!
Almost all devices show as Not Applicable in update rings
Currently almost all devices in our environment show as Not Applicable in the standard Windows Update ring. Newly added devices seem OK.
We previously used GPOs to push update settings. As these conflicted with the Intune settings, we disabled the GPOs. Around that time (not sure exactly when), our devices began showing Not Applicable for an update ring they were previously fine with.
Anyone seen this/have any ideas?
Use an AWS AMI image in Azure?
Hello, we have a standard gold image factory in AWS, so we have many AMIs that are up to date and correct. I was wondering if I could copy an image to Azure, run it through Azure Image Builder, and then use it in Azure? Thanks
Partner Blog | Partner Center Technical Corner: June 2024 edition
By Monilee Keller, Vice President, Product Management
Welcome to the June edition of Partner Center Technical Corner. This month, we review Solutions Partner designations and securing the channel as our spotlight topics, followed by a summary of recent releases. For quick reference, you can find the most up-to-date technical roadmap and essential Partner Center resources at the end of the blog.
Spotlight: Solutions Partner with certified software designations
Within the Microsoft AI Cloud Partner Program, we offer distinct pathways for partners to differentiate themselves and stand out to customers according to their unique business models. These differentiated offerings help partners demonstrate their capabilities so customers can easily find and choose a proven partner. For early-stage ISV partners, we announced ISV Success last year to help them build, publish, and grow well-architected software solutions on the Microsoft Cloud. In March 2024, as part of our ongoing journey to make ISV partners successful, we introduced Solutions Partner with Certified Software designations, offerings for ISVs who are ready to differentiate their software solutions in the market.
Attaining a Solutions Partner with certified software designation signifies that your solution meets technical criteria for interoperability with the Microsoft Cloud and demonstrates a proven track record of customer success. This distinction helps validate the quality, capability, reliability, and relevance of your software solution and drives positive customer experience, delivering on the value customers expect from solutions built on the Microsoft Cloud.
Continue reading here
Public preview: Create multiple prefixes for a subnet in an Azure Virtual Network
Creating multiple prefixes for a subnet enables customers to easily scale their virtual machines and Azure Virtual Machine Scale Sets without the risk of exhausting their subnet’s address space. This feature eliminates the need to remove resources from a subnet as a prerequisite for modifying its address prefixes.
This feature is available in all public cloud regions during public preview.
integration with MS Entra (risk scoring)
Hi,
We have a Risk Scoring app that we want to integrate with MS Entra for a customer.
The integration requires some data collected by MS Entra on the webpage after login completion – a couple of lines of JavaScript.
How could we achieve this with MS Entra? Is there a contact in Microsoft that we could discuss with?
Thank you
Starts June 24! FY25 Business Applications Partner Activities Webinar Series
Be sure to sign up now for the FY25 Business Applications Partner Activities webinar series, coming up June 24-27. The first event is only a week away!
Learn about the priorities and strategy for FY25 – and how you can integrate incentives into your business strategy to grow your business and deliver excellent customer value.
Learn more and register today!
Four new engagements that are launching in FY25
Low Code Vision & Value
CRM Vision & Value
ERP Vision & Value
Low Code Solution Accelerator
A detailed overview of each new engagement, including:
Trends shaping partner opportunity in Business Applications
Overview and value proposition of the solution areas
Goals, expected outcomes, use cases, and targeted scenarios
Detailed walkthrough of activity elements, outputs, and assets available
Register for an upcoming webinar, or if you cannot attend, please watch the recording!
Resource links
Bookmark the Business Applications Partner Activities page to stay updated with the latest resources and program announcements for FY25
Submit a query for Partner Activities Tier 1 support and any other feedback or questions.
Bookmark the Partner Center – Microsoft Commerce Incentives (MCI) Engagements Workspace.
Review the MCI Program Guide and Resources.
—
Demos and webinars are great, but do you need more, such as technical consultations from level 100 to level 400 across Microsoft workloads, a comprehensive growth and success plan built with a Microsoft account manager, or services and benefits that can be monetized without requiring increased headcount? Speak with a Partner Success expert about Premier and Advanced Support for Partners, paid service offerings that drive growth and partner success.
Premier Support for Partners (PSfP) and Advanced Support for Partners (ASfP) are paid partner offerings at Microsoft that provide unmatched value through a wide range of Partner benefits including account management, direct-from-Microsoft advisory consultations, the highest level of reactive support available including up to 15-minute response times on critical cases, and coverage across cloud, hybrid, and on-prem.
Please review these resources to learn more and consider booking a meeting to speak directly with our teams for a better understanding of the value-added benefits of PSfP and ASfP.
Book a meeting with a PSfP Specialist
Visit the PSfP Website
Book a meeting with an ASfP Evangelist
Visit the ASfP Website
Download the ASfP Fact Sheet
View the ASfP Impact Slide
Stop by the ASfP Partner Community