Tag Archives: microsoft
Configure a List View so a User Sees Only Their Own Data
Hello
I have a big list.
The goal is to send the link of the list to users and let them view only their data.
Regards
JFM_12
Read Only Access to Project Schedule for another PM
Hi There!
One of our projects has some dependencies on another team’s project. The other team’s Project Manager requires read-only access to the Project Schedule to check the project activity % complete and who is working on which activity.
How can we provide read-only access to the other PM? We use Project Online to manage the portfolio.
Thanks in advance.
Shri
Edge Management Service: ‘Customization Settings’ tab / Enterprise secure AI settings confusing
In Edge Management Service, when you create a new Configuration Profile, the ‘Customization Settings’ tab is available. On the ‘Enterprise Secure AI’ page of this tab, all settings are blue/checked, giving the impression that these settings are enabled and active. But they are not.
For a new Configuration Profile, the related settings are not present in the Policies tab. So from the ‘Enterprise Secure AI’ perspective the settings appear enabled, but in reality they are not, because the policies are missing from the Policies tab. This is confusing.
This behavior-by-design could be changed, suggestion:
Check box = Grey/unchecked: Not configured = setting not in Policies tab (Edge defaults apply)
Check box = Blue/unchecked: Configured = setting in Policies tab = ‘disabled’
Check box = Blue/checked: Configured = setting in Policies tab = ‘enabled’
A consideration for the Edge Management Service development team.
All comments welcome 😉
Acting on Real-Time data using custom actions with Data Activator
Being able to make data-driven decisions and act on real-time data is important to organizations because it enables them to avert crises in systems that monitor product health, and to take other actions based on their requirements. For example, a shipping company may want to monitor its packages and act in real time when a package’s temperature becomes too hot. One way of monitoring and acting on data is to use Data Activator, a no-code experience in Microsoft Fabric for automatically taking action when a condition, such as the package temperature, is detected in the data.
Let’s look at Contoso Bikes, a fictitious bike rental company that has a real-time dataset of all bikes available in the docking stations scattered across different neighborhoods in the city. They want to optimize the availability and distribution of their bikes based on demand and supply patterns, and automatically trigger alerts, notifications, or workflows when certain conditions are met in the data, such as high or low availability of bikes in a certain neighborhood, to create job requests to increase the number of available bikes at docking stations.
Using custom actions to trigger alerts with Data Activator
To achieve this, you can set up custom actions with Data Activator to trigger alerts that enable Contoso Bikes to act on the data. Custom actions in Data Activator are reusable action templates that you can use in multiple triggers, across multiple Reflex items. A custom action defines how to call a specific external system from a Data Activator trigger using a flow.
1 – Set a trigger and create a custom action
An alert needs to be set on your data already; you can follow these steps to create one using the sample bike data.
Inside your alert, set the trigger condition to select the No_Bikes property from the dataset.
Then set the condition to check when the number of bikes drops below 5, so that the trigger runs when the condition is met. This can also be filtered down further; for example, you can add a filter that checks for a specific Neighbourhood.
Next, create a custom action by selecting Custom Action, then + Create, in the top navigation bar. Give your custom action a name, then add input fields, which are used as input when triggering alerts. Finally, copy the connection string.
Select Create flow in Power Automate; this will navigate to Power Automate. You can add any action and save the flow so that you can come back to Data Activator.
In Data Activator, scroll to the Act section, choose your custom action, and fill out the required information for the properties. You can create corresponding properties using New Property in the top navigation bar.
Select Save.
2 – Configure a custom action in Power Automate
To trigger and send actionable alerts to Contoso Bikes employees, you can use Power Automate as a custom action. You will then have a notification sent to Contoso Bikes with an option to mark a job request as completed after the bike docking stations have been restocked.
In the Power Automate flow created for your custom action, update the When a Data Activator trigger fires step with the connection string.
Add a Create item action; this will allow Contoso Bikes to receive and store job requests in a SharePoint list. Update it with the relevant fields, including the ones from Data Activator; use this format to get values from the trigger: triggerOutputs()?['body/inputFields/Bikepoint']. You can opt to use another data source.
Add a Post an Adaptive Card and wait for a response action in a Teams channel to alert Contoso Bikes and take action. Update the message property for this action with the JSON schema below.
{
  "type": "AdaptiveCard",
  "$schema": "http://adaptivecards.io/schemas/adaptive-card.json",
  "version": "1.2",
  "body": [
    {
      "type": "Container",
      "items": [
        {
          "type": "TextBlock",
          "text": "Not enough bikes at the docking stations",
          "wrap": true,
          "style": "heading",
          "id": "lblDisplay"
        }
      ]
    },
    {
      "type": "TextBlock",
      "text": "Please see and increase number of bikes in the docking stations below",
      "wrap": true,
      "size": "Small",
      "weight": "Lighter",
      "color": "Default",
      "isSubtle": true,
      "id": "lblDescription"
    },
    {
      "type": "FactSet",
      "facts": [
        {
          "title": "Bikepoint:",
          "value": "@{triggerOutputs()?['body/inputFields/Bikepoint']}"
        },
        {
          "title": "Neighbourhood:",
          "value": "@{triggerOutputs()?['body/inputFields/Neighbourhood']}"
        },
        {
          "title": "No. of bikes:",
          "value": "@{triggerOutputs()?['body/inputFields/No of Bikes']}"
        },
        {
          "title": "No. of empty docks:",
          "value": "@{triggerOutputs()?['body/inputFields/Empty Docks']}"
        },
        {
          "title": "Street:",
          "value": "@{triggerOutputs()?['body/inputFields/Street']}"
        },
        {
          "title": "Job No.:",
          "value": "@{body('Create_a_job_item')?['Title']}"
        }
      ]
    },
    {
      "type": "ActionSet",
      "actions": [
        {
          "type": "Action.Submit",
          "title": "Completed",
          "id": "btnCompleted"
        }
      ]
    }
  ]
}
Lastly, add an Update item action to change the job request status from Not Started to Completed after the Completed button on the Adaptive Card alert has been selected, indicating the docking stations have sufficient bikes available. As a best practice, use consistent naming conventions for your flow steps.
To sum up, by creating custom actions you can act on data in real time and across different platforms. You can also customize the actions to suit your specific needs and scenarios. With Data Activator, you can turn your data into actions and achieve your business goals.
Other resources
Get started with Data Activator training
Get started with Real-Time Intelligence
Get started with Data Activator documentation
Use Cases for Testing restrictOutboundNetworkAccess for Speech Service
Transcribing an Audio File from a Storage Account Using the Speech Service
Use Case 1 – Azure Speech Service outbound access not restricted
Prepare the Audio File: Upload the audio file to your storage account and note its URL. You can take a sample file from here: cognitive-services-speech-sdk/sampledata/audiofiles at master · Azure-Samples/cognitive-services-speech-sdk · GitHub
Set Up the Speech Service: Obtain the API key and endpoint URL from your Speech service in Azure, as in the reference below.
Make the POST Request: Follow the steps below in Postman to make a POST request to the Speech service.
Open Postman and create a new POST request.
Set the URL to https://<SpeechServiceLocation>.api.cognitive.microsoft.com/speechtotext/v3.2/transcriptions.
Add Headers:
Ocp-Apim-Subscription-Key: <keyOfSpeechService>
Content-Type: application/json
Set the Body to raw and select JSON format.
Then, paste the following JSON:
{
  "contentUrls": [
    "SASLinkToAudioFileOnStorage"
  ],
  "locale": "en-US",
  "displayName": "My Transcription",
  "model": null,
  "properties": {
    "wordLevelTimestampsEnabled": true,
    "languageIdentification": {
      "candidateLocales": [
        "en-US", "de-DE", "es-ES"
      ]
    }
  }
}
The POST request will return a status code of 201, as shown; it indicates that the request was successfully processed and a new transcription job has been created. This status code confirms that the transcription process has been initiated.
In the response body of the POST request, find the URL provided under the ‘links’ section and make a GET request to that URL.
The response from this GET request will contain a contentUrl, which you use to make another request to fetch the transcribed data.
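The POST-then-poll flow above can also be scripted outside Postman. Below is a minimal Python sketch (not official sample code): the region, key, and SAS URL are placeholders you must fill in, and the response-field names (links.files, values[].links.contentUrl) follow the v3.2 batch transcription schema.

```python
import requests  # third-party HTTP client

SPEECH_ENDPOINT = "https://<SpeechServiceLocation>.api.cognitive.microsoft.com"  # placeholder
SPEECH_KEY = "<keyOfSpeechService>"  # placeholder


def build_headers(key: str) -> dict:
    # Same headers as the Postman request above.
    return {"Ocp-Apim-Subscription-Key": key, "Content-Type": "application/json"}


def build_body(sas_url: str) -> dict:
    # Same JSON body as the Postman request above.
    return {
        "contentUrls": [sas_url],
        "locale": "en-US",
        "displayName": "My Transcription",
        "model": None,
        "properties": {
            "wordLevelTimestampsEnabled": True,
            "languageIdentification": {"candidateLocales": ["en-US", "de-DE", "es-ES"]},
        },
    }


def start_transcription(sas_url: str) -> str:
    # POST creates the transcription job; a 201 response is expected.
    resp = requests.post(
        f"{SPEECH_ENDPOINT}/speechtotext/v3.2/transcriptions",
        headers=build_headers(SPEECH_KEY),
        json=build_body(sas_url),
    )
    resp.raise_for_status()
    # The 'links' section of the response carries the files URL to poll.
    return resp.json()["links"]["files"]


def get_content_urls(files_url: str) -> list:
    # GET the files URL; each result entry exposes a contentUrl.
    resp = requests.get(files_url, headers=build_headers(SPEECH_KEY))
    resp.raise_for_status()
    return [f["links"]["contentUrl"] for f in resp.json()["values"]]
```

Fetching each contentUrl then returns the transcription JSON, as in the Postman walkthrough.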
Since the outbound access was not disabled, we were able to fetch the Transcribed data from speech service.
Use Case 2 – Azure Speech Service outbound access is restricted
Repeat the steps from Use Case 1 to send the POST request to the Speech service.
The request will return a status code of 403. This means that we are not allowed to access the audio file from the storage account, because here we have set "restrictOutboundNetworkAccess": true and "allowedFqdnList": ["microsoft.com"], which restricts outbound access so that the Speech service can only reach microsoft.com.
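For reference, the restriction in Use Case 2 corresponds to properties like the following on the Speech resource. This is a hedged sketch of the resource-properties fragment, not a complete ARM template; note that allowedFqdnList is a list in the resource schema.

```json
{
  "properties": {
    "restrictOutboundNetworkAccess": true,
    "allowedFqdnList": ["microsoft.com"]
  }
}
```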
High costs for MS Syntex – although restricted to some sites – maybe open for a short time
Hi everyone,
we are experiencing high costs for MS Syntex. We just wanted to use it for some sites, so I restricted it to 2 sites.
But I did not see that I needed to restrict each service, so some users still seem to “use” it.
How can I identify these sites and stop them from using any of the services?
How can I completely disable Syntex for our tenant if nothing else works?
BR
Stephan
[MSFT Defender for Cloud] Connection Defender for cloud and AWS account with free 30-day period
Hello.
I’m a cloud security engineer.
We are preparing to link MSFT MDC to enhance the security of our AWS environment.
I’m posting because I have some questions before linking.
1. MDC offers a 30-day free period for Azure subscriptions. However, we couldn’t find any mention of free offers for AWS accounts. Does MDC offer free usage for AWS?
2. Are API query costs and log collection and archiving costs incurred separately, in addition to MDC costs?
Please answer me.
Thank you.
The same document looks different in SharePoint than the desktop App (word)
In the SharePoint Online Word view, numbered bullets start like 1.0, 1.1, 1.2, …, but in the desktop view they are 1.1, 1.2, 1.3, ….
I have tried the troubleshooting steps below.
1. Modified the site’s permission hierarchy for a specific user. The bullets begin with 1.0, 1.1, 1.2, and so on when the user is granted access to the site as owner, member, or site administrator. The document is correct in both the desktop and SharePoint views when I remove all site permissions and grant only read access.
2. Used a different browser (we normally use Microsoft Edge) to visit the same SharePoint location. The identical problem occurred.
3. Attempted to use the identical file on a different SharePoint site. However, the problem persisted.
Can someone please help me to fix this issue?
Steps to Security hardening in Windows server
Hello everyone,
I have experience with Active Directory and other areas, but I am quite new to server hardening. I am not sure where I should start, what kind of actions I need to take, or what I should prepare to get a server hardened.
My company mostly runs Windows Server 2019. We also have GPOs in our AD for password expiry, password complexity, mapped network drives, printers, etc. We are going to deploy some new Windows Server 2022 servers. I want to harden these servers, but I’m just not sure what procedure to prepare.
Regards,
Timothy
Using SSIS Expression to build query to run on Oracle source
I am trying to build a query to run on an Oracle source by putting it in a variable. I am pulling the max date into a variable varChildpkgMaxDt (e.g. 2024-08-10 17:21:04.670) from the SQL Server target table.
I need to build a query to run on the Oracle source. I have used datetime as the data type for varChildpkgMaxDt to fetch it from SQL Server in the parent package, and I am passing it as a parameter to the child package.
I created a varCpkgQuery variable as a string, and I am getting a syntax error when I build this.
I also want to ensure that varChildpkgMaxDt is properly compared in Oracle with NVL(REVIEWDATE,CREATED), as these two fields in Oracle are dates with timestamps of millisecond precision (2024-08-10 17:21:04.670).
"SELECT *
FROM dbo.virDocument
WHERE NVL(REVIEWDATE,CREATED) > " + (DT_DBTIMESTAMP2,3)@[User::varChildPkgMaxDt]
Using an Oracle 19c source, VS 2017 SSDT, and SQL Server 2016.
Renew Azure Data Engineer Associate certification
Hi there experts,
I received a renewal email the other day for “Azure Data Engineer Associate” stating:
“Your Microsoft Certified: Azure Data Engineer Associate certification is now eligible for renewal. You have until November 2, 2024 (UTC) to pass the renewal assessment.”
However, when I follow the link in the email and log in to my Learn profile, the “Take the renewal assessment” link is inactive.
And when I check the Certifications area under my profile, there are none listed.
Can someone help?
Cheers
Web development
Hi I am interested in learning how to make websites. What should I do in college?
Step by Step: Integrate Advanced RAG Service with Your Own Data into Copilot Studio
This post explains how to use the Advanced RAG Service to easily verify proper RAG performance on your own data, and how to integrate it as a service endpoint into Copilot Studio.
This time we use CSV as a sample. CSV is structured text data; when we use basic RAG to process a multi-page CSV file as a vector index and perform similarity search on it using natural language, the grounded data is always chunked, which makes it hard for the LLM to understand the whole data picture.
For example, if we have 10,000 rows in a CSV file and ask “how many rows does the data contain and what’s the mean value of the visits column”, a general semantic search service usually cannot give exactly the right answer if it handles the data as unstructured. We need a different, advanced RAG method to handle the CSV data here.
Thankfully, the LlamaIndex Pandas Query Engine provides a good way of understanding data frame content through natural language. However, verifying its performance against other approaches and integrating it into an existing enterprise environment, such as Copilot Studio or other user-facing services, requires AI service development experience and involves a certain learning curve and time investment from POC to production.
The Advanced RAG Service supports 6 of the latest advanced indexing techniques, including a CSV Query Engine; developers can leverage it to shorten the POC stage of development and reach production. Here is a detailed step-by-step guideline:
a. In your Docker environment, run this command to clone the Dockerfile and related config sample (the Docker deploy repo linked at the end of this post):
git clone https://github.com/freistli/AdvancedRAG
b. In the AdvancedRAG folder, rename .env.sample to .env
mv .env.sample .env
c. In the .env file, configure necessary environment variables. In this tutorial, let’s configure:
AZURE_OPENAI_API_KEY=
AZURE_OPENAI_Deployment=gpt-4o-mini
AZURE_OPENAI_EMBEDDING_Deployment=text-embedding-3-small
AZURE_OPENAI_ENDPOINT=https://[name].openai.azure.com/
# Azure Document Intelligence
DOC_AI_BASE=https://[name].cognitiveservices.azure.com/
DOC_AI_KEY=
NOTE:
d. Build your own docker image:
e. Run this docker:
f. Access http://localhost:8000/
a. Click the CSV Query Engine tab, upload a test CSV file, click Submit
b. Click the Chat Mode tab; now we can use natural language to test how good the CSV Query Engine is at understanding CSV content:
The Advanced RAG Service is built with Gradio and FastAPI. It opens the necessary API endpoints by default. We can turn off any of them in the .env settings.
The Chat endpoint can be used for queries/searches against different index types. Since we are using the “CSV Query Engine”, it is now:
content-type: application/json
{
  "data": [
    "how many records does it have",
    "",
    "CSV Query Engine",
    "/tmp/gradio/86262b8036b56db1a2ed40087bbc772f619d0df4/titanic_train.csv",
    "You are a friendly AI Assistant",
    false
  ]
}
The response is:
{
  "data": [
    "The dataset contains a total of 891 records. If you have any more questions about the data, feel free to ask!",
    null
  ],
  "is_generating": true,
  "duration": 3.148253917694092,
  "average_duration": 3.148253917694092,
  "render_config": null,
  "changed_state_ids": []
}
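The request/response pair above can also be scripted. Below is a minimal Python sketch: the exact Chat endpoint URL depends on your deployment, so CHAT_URL is a placeholder assumption, and the payload simply mirrors the request body shown above.

```python
import requests  # third-party HTTP client

# Placeholder: replace with the actual Chat endpoint URL of your deployment.
CHAT_URL = "http://localhost:8000/<chat-endpoint-path>"


def build_chat_payload(question: str, file_path: str) -> dict:
    # Mirrors the request body shown above: question, history (unused),
    # index type, server-side file path, system message, streaming flag.
    return {
        "data": [
            question,
            "",
            "CSV Query Engine",
            file_path,
            "You are a friendly AI Assistant",
            False,
        ]
    }


def ask(question: str, file_path: str) -> str:
    resp = requests.post(
        CHAT_URL,
        headers={"content-type": "application/json"},
        json=build_chat_payload(question, file_path),
    )
    resp.raise_for_status()
    # As in the sample response above, the answer is the first element
    # of the "data" array.
    return resp.json()["data"][0]
```

This is the same call the Power Automate HTTP step makes later in this post.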
Using this method, we can easily integrate this specific RAG capability into our own service, such as Copilot Studio. Before that, let’s publish the service first.
There are different ways to release a Docker image as an app service. Here are the general steps when using Azure Container Registry and Azure Container Apps.
a. Create Azure Container Registry resource [ACRNAME], upload your tested docker image to it. The command is:
az account set -s [your subscription]
az acr login -n [ACRNAME]
docker push [ACRNAME].azurecr.io/dockerimage:tag
b. Create an Azure Container App and deploy this Docker image to it. Don’t forget to enable Session Affinity for the Container App.
To automate the Azure Container App deployment, I have provided deploy_acr_app.sh in the repo.
set -e
if [ $# -eq 0 ]
then
    echo "No SUF_FIX supplied, it should be an integer or a short string"
    docker image list
    exit 1
fi
SUF_FIX=$1
RESOURCE_GROUP="rg-demo-${SUF_FIX}"
LOCATION="eastus"
ENVIRONMENT="env-demo-containerapps"
API_NAME="advrag-demo-${SUF_FIX}"
FRONTEND_NAME="advrag-ui-${SUF_FIX}"
TARGET_PORT=8000
ACR_NAME="advragdemo${SUF_FIX}"
az group create --name $RESOURCE_GROUP --location "$LOCATION"
az acr create --resource-group $RESOURCE_GROUP --name $ACR_NAME --sku Basic --admin-enabled true
az acr build --registry $ACR_NAME --image $API_NAME .
az containerapp env create --name $ENVIRONMENT --resource-group $RESOURCE_GROUP --location "$LOCATION"
az containerapp create --name $API_NAME --resource-group $RESOURCE_GROUP --environment $ENVIRONMENT --image $ACR_NAME.azurecr.io/$API_NAME --target-port $TARGET_PORT --ingress external --registry-server $ACR_NAME.azurecr.io --query properties.configuration.ingress.fqdn
az containerapp ingress sticky-sessions set -n $API_NAME -g $RESOURCE_GROUP --affinity sticky
To use it:
./deploy_acr_app.sh [suffix number]
Note: for more details about this script, refer to this guideline.
After around 7~8 minutes, the Azure Container App will be ready. You can check the output and access it directly:
To protect your container app, you can follow this guide to enable authentication on it:
Enable authentication and authorization in Azure Container Apps with Microsoft Entra ID
By default, we need to upload a CSV to the AdvRAG service before analysis. The service always saves the uploaded file to a local temp folder on the server side, and we can then use the temp file path to start the analysis query.
To skip this step, we can save common files in the rules subfolder of the AdvancedRAG folder and then build the Docker image; the files will be copied into the image itself. As a demo, I can put a CSV file in AdvancedRAG/rules/files and then publish the Docker image to Azure.
a. Open Copilot Studio, create a new Topic, use “CSV Query” to trigger it.
b. For demo purposes, I uploaded a test CSV file, got its path, and put it into a variable:
c. Now let’s add a Question step to ask what question the user wants to ask:
d. Click “+”, “Add an Action”, “Create a flow”. We will use this new flow to call the AdvancedRAG service endpoint.
e. We need Query, File_Path, and System_Message as input variables.
f. In the flow editor, add an HTTP step. In that step, post the request to the AdvancedRAG endpoint as below:
Save the flow as ADVRAGSVC_CSV, and publish it.
g. Back in the Copilot Studio topic, add the action as below, and set the input variables as needed:
h. Publish and open this custom Copilot in the Teams channel based on this guide.
i. Now we can test this topic like this; as you can see, even though I used gpt-4o-mini here, the response accuracy is very good:
The above shows how to quickly verify a potentially useful RAG technique (the Pandas Query Engine) in the Advanced RAG service studio, then expose and publish it as a REST API endpoint that can be used by other services, such as Copilot Studio.
The Advanced RAG service focuses on the key logic and stability of several important index types, and on the efficiency of landing them in M365 AI use cases. For any feature improvement ideas, feel free to visit the repos below to create issues, fork the projects, and create PRs.
Docker Deploy Repo: https://github.com/freistli/AdvancedRAG
Source Code Repo: https://github.com/freistli/AdvancedRAG_SVC
Exploring the Advanced RAG (Retrieval Augmented Generation) Service
Sharing a Teams URL
Hi all,
I am hoping you can help.
We are using a Tap IP and Huddle Bar for online meetings. Normally we would receive a meeting invite from an external party and then forward it to the meeting room, and then launch the meeting from there.
Recently we have been receiving meeting requests from third parties who send just a copy-paste of the Teams link URL (i.e., it is not a typical Teams meeting invite email with the Teams logo, etc.).
We have not been able to successfully forward the URL to the Tap IP/Huddle Bar to launch the meeting. Is this a limitation of this setup or is there a specific way these invites should be handled?
Super user AIP
Hi All,
I have a question related to AIP.
A few of our users applied MIP protection to Excel files and sent them to internal users via email (not saved on SharePoint or a file share). Those users no longer work for the organisation. I would like to know: if I save a file to a local PC, can I open it as a super user (as per the article below) to view the content or reassign the protection settings?
https://learn.microsoft.com/en-us/azure/information-protection/configure-super-users
Remote Desktop on Win11 Pro toggle unable to turn on
Hello all,
I noticed that the “Remote Desktop” toggle in Settings on Windows 11 Pro (version 23H2) does not stay in the “On” position after I toggle it. It displays a message, I confirm it, and it switches back to the “Off” position.
Troubleshooting steps done:
Reset the NIC (which is not the issue)
Taken the workstation off the domain to confirm if it was a domain restriction but will not turn on even as a local admin account (which is not the issue)
I have disabled third-party AV (which is not the issue)
I have disabled the Windows firewall (which is not the issue)
I have allowed the port 3389 (which is not the issue)
And last, there is no GPO policy
It was working before; as of three days ago I am unable to RDP to my workstation.
Has anybody run into this issue before and fixed it?
Thanks
VLOOKUP Issue
Hi,
My apologies as this is a very basic VLOOKUP question but I can’t figure it out. I have data where every person has a research partner. I’m trying to create a Partner_ID variable that contains the ID of the person the participant is partnered with. I’m pretty sure VLOOKUP is the best way to do this but it isn’t working for me. I have put an example with fake data in the screenshot below. Essentially, I want VLOOKUP to search for the name “Cox” in the Table and find the name under the LastName column, then print the ID that is next to it (in this case, cell D2 should print a 3, indicating that Perry’s Partner is ID#3). As you can see, however, Excel is returning error messages that it can’t find the expected value. I’ve looked at various tutorials and I can’t figure out what I’m doing wrong. Thank you!
The formula I’m using is =VLOOKUP(C2,$A$2:$B$7,1,FALSE)
Why do I have to have AI?
I want the speed and memory of the latest Surface Pro, but I don’t want anything to do with AI. Can I purchase a Surface Pro without AI?