Month: September 2024
expand a dynamic named range
I work in insurance. I am building a template that can work with our customers who have a random number of health plan options (possibly more than one each of: health plan, dental plan, vision plan). The goal is to do a total plan design cost. There are 4 options with each potential plan choice: Employee Only (EO), Employee Plus Spouse (ESP), Employee Plus Children (ECH), Employee Plus Family (FAM).
I have a dynamic named range that concatenates all health plans, followed by all dental plans, followed by all vision plans, into a single column. This data set can be a minimum of 3 (1 health, 1 dental, 1 vision) to a maximum of who knows (as each type of policy can have many options).
What I need is to create a dynamic table whose first column contains the plan names (i.e., health1, health2, health3, dental1, dental2, vision1, vision2, vision3, etc.). The second column would have EO, ESP, ECH, FAM (the 4 options I listed earlier) for each item in the first column, and subsequent columns would hold premium data that I can deal with.
The problem I’m having is coming up with a way to create this dynamic table where the health1,health2, etc. stuff is inserted in every 4th cell, so the 2nd column gives the 4 different options EO, ESP, ECH, FAM for each of the entries health1,health2, etc.
Anybody have a suggestion how to expand a dynamic named range so I can have it populate the first column of a dynamic length table every 4th row?
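One possible sketch, assuming Excel 365 dynamic arrays and that the concatenated plan list lives in a named range called Plans (a placeholder name): spill the two leading columns of the table with formulas like these.
Column 1 – each plan name repeated for a block of four rows:
=INDEX(Plans, ROUNDUP(SEQUENCE(ROWS(Plans)*4)/4, 0))
Column 2 – the four tiers, cycling for every plan:
=INDEX({"EO";"ESP";"ECH";"FAM"}, MOD(SEQUENCE(ROWS(Plans)*4)-1, 4)+1)
If the plan name should appear only on the first of its four rows, the first formula can be wrapped as =IF(MOD(SEQUENCE(ROWS(Plans)*4)-1, 4)=0, INDEX(Plans, ROUNDUP(SEQUENCE(ROWS(Plans)*4)/4, 0)), "") so the other three rows stay blank.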
The Future of AI: Fine-Tuning Llama 3.1 8B on Azure AI Serverless, why it’s so easy & cost efficient
The Future of AI: LLM Distillation just got easier
Part 2 – Fine-Tuning Llama 3.1 8B on Azure AI Serverless
How Azure AI Serverless Fine-tuning, LoRA, RAFT and the AI Python SDK are streamlining fine-tuning of domain specific models. (🚀🔥 Github recipe repo).
By Cedric Vidal, Principal AI Advocate, Microsoft
Part of the Future of AI 🚀 series initiated by Marco Casalaina with his Exploring Multi-Agent AI Systems blog post.
AI-powered engine fine-tuning setup, generated using Azure OpenAI DALL-E 3
In our previous blog post, we explored utilizing Llama 3.1 405B with RAFT to generate a synthetic dataset. Today, you’ll learn how to fine-tune a Llama 3.1 8B model with the dataset you generated. This post will walk you through a simplified fine-tuning process using Azure AI Fine-Tuning as a Service, highlighting its ease of use and cost efficiency. We’ll also explain what LoRA is and why combining RAFT with LoRA provides a unique advantage for efficient and affordable model customization. Finally, we’ll provide practical, step-by-step code examples to help you apply these concepts in your own projects.
> The concepts and source code mentioned in this post are fully available in the Github recipe repo.
Azure AI takes the complexity out of the equation. Gone are the days when setting up GPU infrastructure, configuring Python frameworks, and mastering model fine-tuning techniques were necessary hurdles. Azure Serverless Fine-Tuning allows you to bypass the hassle entirely. Simply upload your dataset, adjust a few hyperparameters, and start the fine-tuning process. This ease of use democratizes AI development, making it accessible to a wider range of users and organizations.
Why Azure AI Serverless Fine-Tuning Changes the Game
Fine-tuning a model used to be a daunting task:
Skill Requirements: Proficiency in Python and machine learning frameworks like TensorFlow or PyTorch was essential.
Resource Intensive: Setting up and managing GPU infrastructure required significant investment.
Time-Consuming: The process was often lengthy, from setup to execution.
Azure AI Fine-Tuning as a Service eliminates these barriers by providing an intuitive platform where you can fine-tune models without worrying about the underlying infrastructure. With serverless capabilities, you simply upload your dataset, specify hyperparameters, and hit the “fine-tune” button. This streamlined process allows for quick iterations and experimentation, significantly accelerating AI development cycles.
Llama relaxing in a workshop, generated using Azure OpenAI DALL-E 3
LoRA: A Game-Changer for Efficient Fine-Tuning
What is LoRA?
LoRA (Low-Rank Adaptation) is an efficient method for fine-tuning large language models. Unlike traditional fine-tuning, which updates all the model’s weights, LoRA modifies only a small fraction of the weights, captured in an adapter. This focused approach drastically reduces the time and cost needed for fine-tuning while maintaining the model’s performance.
LoRA in Action
LoRA fine-tunes models by selectively adjusting a small fraction of weights via an adapter, offering several advantages:
Selective Weight Updating: Only a fraction of the weights are fine-tuned, reducing computational requirements.
Cost Efficiency: Lower computational demands translate to reduced operational costs.
Speed: Fine-tuning is faster, enabling quicker deployments and iterations.
Illustration of LoRA Fine-tuning. This diagram shows a single attention block enhanced with LoRA. Each attention block in the model typically incorporates its own LoRA module. SVG diagram generated using Azure OpenAI GPT-4o
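To make the adapter idea concrete, here is a minimal PyTorch sketch of a LoRA-style linear layer. This is an illustration, not the code Azure AI runs; the rank, scaling factor, and class name are arbitrary choices for the example.
import torch.nn as nn

class LoRALinear(nn.Module):
    # Wraps a frozen linear layer and adds a small trainable low-rank adapter.
    def __init__(self, base: nn.Linear, rank: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False  # the original weights stay frozen
        self.lora_a = nn.Linear(base.in_features, rank, bias=False)   # down-projection A
        self.lora_b = nn.Linear(rank, base.out_features, bias=False)  # up-projection B
        nn.init.zeros_(self.lora_b.weight)  # adapter starts as a no-op
        self.scaling = alpha / rank

    def forward(self, x):
        # frozen base output plus the low-rank update B(A(x)), scaled
        return self.base(x) + self.lora_b(self.lora_a(x)) * self.scaling
Only lora_a and lora_b receive gradient updates during fine-tuning, which is why the adapter represents a tiny fraction of the model’s weights and can be trained quickly and cheaply.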
Combining RAFT and LoRA: Why It’s So Effective
We’ve seen how Serverless Fine-tuning on Azure AI uses LoRA, which updates only a fraction of the model’s weights and can therefore be so cheap and fast.
With the combination of RAFT and LoRA, the model is not taught new fundamental knowledge; instead, it becomes an expert at understanding the domain, focusing its attention on the citations that are most useful to answer a question, but it doesn’t contain all the information about the domain. It is like a librarian (see the RAG Hack session on RAFT): a librarian doesn’t know the content of every book perfectly, but knows which books contain the answers to a given question.
Another way to look at it is from the standpoint of information theory. Because LoRA only updates a fraction of the weights, there is only so much information you can store in those weights, as opposed to full-weight fine-tuning, which updates all the weights of the model from bottom to top.
LoRA might look like a limitation but it’s actually perfect when used in combination with RAFT and RAG. You get the best of RAG and fine-tuning. RAG provides access to a potentially infinite amount of reference documents and RAFT with LoRA provides a model which is an expert at understanding the documents retrieved by RAG at a fraction of the cost of full weight fine-tuning.
Azure AI Fine-Tuning API and the Importance of Automating your AI Ops Pipeline
Azure AI empowers developers with serverless fine-tuning via an API, simplifying the integration of fine-tuning processes into automated AI operations (AI Ops) pipelines. Organizations can use the Azure AI Python SDK to further streamline this process, enabling seamless orchestration of model training workflows. This includes systematic data handling, model versioning, and deployment. Automating these processes is crucial as it ensures consistency, reduces human error, and accelerates the entire AI lifecycle—from data preparation, through model training, to deployment and monitoring. By leveraging Azure AI’s serverless fine-tuning API, along with the Python SDK, organizations can maintain an efficient, scalable, and agile AI Ops pipeline, ultimately driving faster innovation and more reliable AI systems.
Addressing Model Drift and Foundation Model Obsolescence
One critical aspect of machine learning, especially in fine-tuning, is ensuring that models generalize well to unseen data. This is the primary purpose of the evaluation phase.
However, as domains evolve and documents are added or updated, models will inevitably begin to drift. The rate of this drift depends on how quickly your domain changes; it could be a month, six months, a year, or even longer.
Therefore, it’s essential to periodically refresh your model and execute the distillation process anew to maintain its performance.
Moreover, the field of AI is dynamic, with new and improved foundational models being released frequently. To leverage these advancements, you should have a streamlined process to re-run distillation on the latest models, enabling you to measure improvements and deploy updates to your users efficiently.
Why Automating the Distillation Process is Essential
Automation in the distillation process is crucial. As new documents are added or existing ones are updated, your model’s alignment with the domain can drift over time. Setting up an automated, end-to-end distillation pipeline ensures that your model remains current and accurate. By regularly re-running the distillation, you can keep the model aligned with the evolving domain, maintaining its reliability and performance.
Practical Steps: Fine-Tuning Llama 3.1 8B with RAFT and LoRA
Now that we’ve explained the benefits, let’s walk through the practical steps using the raft-distillation-recipe repository on GitHub.
If you have not yet run the synthetic data generation phase using RAFT, I invite you to head over to the previous article in this blog series.
Once you have your synthetic dataset on hand, you can head over to the finetuning notebook of the distillation recipe repository.
Here are the key snippets of code illustrating how to use the Azure AI Python SDK to upload a dataset, subscribe to the Marketplace offer, and create and submit a fine-tuning job on the Azure AI Serverless platform.
Uploading the training dataset
The following code checks if the training dataset already exists in the workspace and uploads it only if needed. It incorporates the hash of the dataset into the filename, facilitating easy detection of whether the file has been previously uploaded.
from azure.ai.ml.constants import AssetTypes
from azure.ai.ml.entities import Data

dataset_version = "1"
train_dataset_name = f"{ds_name}_train_{train_hash}"

try:
    # Reuse the data asset if a file with the same content hash was already uploaded
    train_data_created = workspace_ml_client.data.get(train_dataset_name, version=dataset_version)
    print(f"Dataset {train_dataset_name} already exists")
except Exception:
    print(f"Creating dataset {train_dataset_name}")
    train_data = Data(
        path=dataset_path_ft_train,
        type=AssetTypes.URI_FILE,
        description=f"{ds_name} training dataset",
        name=train_dataset_name,
        version=dataset_version,
    )
    train_data_created = workspace_ml_client.data.create_or_update(train_data)

from azure.ai.ml.entities._inputs_outputs import Input

# Reference the uploaded data asset as an input for the fine-tuning job
training_data = Input(
    type=train_data_created.type,
    path=f"azureml://locations/{workspace.location}/workspaces/{workspace._workspace_id}/data/{train_data_created.name}/versions/{train_data_created.version}",
)
Subscribing to the Marketplace offer
This step is only necessary when fine-tuning a model from a third-party vendor such as Meta or Mistral. If you’re fine-tuning a Microsoft first-party model such as Phi-3, you can skip this step.
from azure.ai.ml.entities import MarketplaceSubscription
from azure.core.exceptions import ResourceExistsError

# Derive the model ID and a subscription name from the foundation model's asset ID
model_id = "/".join(foundation_model.id.split("/")[:-2])
subscription_name = model_id.split("/")[-1].replace(".", "-").replace("_", "-")

print(f"Subscribing to Marketplace model: {model_id}")

marketplace_subscription = MarketplaceSubscription(
    model_id=model_id,
    name=subscription_name,
)

try:
    marketplace_subscription = workspace_ml_client.marketplace_subscriptions.begin_create_or_update(marketplace_subscription).result()
except ResourceExistsError:
    print(f"Marketplace subscription {subscription_name} already exists for model {model_id}")
Create the fine-tuning job using the model and data as inputs
finetuning_job = CustomModelFineTuningJob(
    task=task,
    training_data=training_data,
    validation_data=validation_data,
    hyperparameters={
        "per_device_train_batch_size": "1",
        "learning_rate": str(learning_rate),
        "num_train_epochs": "1",
        "registered_model_name": registered_model_name,
    },
    model=model_to_finetune,
    display_name=job_name,
    name=job_name,
    experiment_name=experiment_name,
    outputs={"registered_model": Output(type="mlflow_model", name=f"ft-job-finetune-registered-{short_guid}")},
)
Submit the fine-tuning job
The following snippet will submit the previously created fine-tuning job to the Azure AI serverless platform. If the submission is successful, the job details including the Studio URL and the registered model name will be printed. Any errors encountered during the submission will be displayed as well.
try:
    print(f"Submitting job {finetuning_job.name}")
    created_job = workspace_ml_client.jobs.create_or_update(finetuning_job)
    print(f"Successfully created job {finetuning_job.name}")
    print(f"Studio URL is {created_job.studio_url}")
    print(f"Registered model name will be {registered_model_name}")
except Exception as e:
    print("Error creating job", e)
    raise e
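If you want the notebook to block until fine-tuning finishes, the same client can also stream the job’s logs. This is an optional addition, not part of the recipe’s snippet above; it only assumes the workspace_ml_client and created_job objects from the previous cells.
# Stream logs until the job reaches a terminal state (Completed, Failed, or Canceled)
workspace_ml_client.jobs.stream(created_job.name)

# Inspect the final status once streaming returns
finished_job = workspace_ml_client.jobs.get(created_job.name)
print(f"Job {finished_job.name} finished with status {finished_job.status}")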
The full runnable code is available in the previously mentioned finetuning notebook.
Join the Conversation
We invite you to join our tech community on Discord to discuss fine-tuning techniques, RAFT, LoRA, and more. Whether you’re a seasoned AI developer or just starting, our community is here to support you. Share your experiences, ask questions, and collaborate with fellow AI enthusiasts. Join us on Discord and be part of the conversation!
What’s next?
This concludes the second installment of our blog series on fine-tuning the Llama 3.1 8B model with RAFT and LoRA, harnessing the capabilities of Azure AI Serverless Fine-Tuning. Today, we’ve shown how these advanced technologies enable efficient and cost-effective model customization that precisely meets your domain needs.
By integrating RAFT and LoRA, you can transform your models into specialists that effectively navigate and interpret relevant information from extensive document repositories using RAG, all while significantly cutting down on the time and costs associated with full weight fine-tuning. This methodology accelerates the fine-tuning process and democratizes access to advanced AI capabilities.
With the detailed steps and code snippets provided, you now have the tools to implement serverless fine-tuning within your AI development workflow. Leveraging automation in AI Ops will help you maintain and optimize model performance over time, keeping your AI solutions competitive in an ever-changing environment.
Stay tuned! In two weeks, we’ll dive into the next topic: deploying our fine-tuned models.
I don't understand how to add custom config to my app with Android
Hi there
I am struggling to understand how I can add custom config to my app on Android so that when users download the app from Intune, the config is automatically passed with the installation.
The config is a key/value pair:
qrCode: "{"applicationName":"LoremIpsum","baseUrl":"https://LoremIpsum.com.au/01/api/mobile/"}"
acceptTermsAndConditions: true
The qrCode value is a string
The acceptTermsAndConditions value is a boolean
I have managed to get this working with iOS very easily. iOS app config provides a nice UI to add key/value pairs (first image). Android (second image), however, does not provide this UI to add config, and I don't understand why. If someone can tell me how to get Android config to be entered like iOS, that would be amazing.
Desktop support enrolling Autopilot devices – DeviceCapReached error
We’re currently in the middle of a quarterly equipment lease swap and have had a couple of people on our team getting the DeviceCapReached error when we go to enroll an Autopilot device. This is happening because we’re enrolling the devices with our accounts, rather than having the user sign in, then taking the laptop back from them to put it in the right on prem OU, run updates and install all of the software they need. I understand this isn’t how Microsoft designed Autopilot to work, but this is where we’re at.
I’ve done research into potential resolutions, but I have a lot of questions. First, some important details:
User-driven deployment profile (future proofing, I guess)
Microsoft Entra hybrid enrollment
Intune device enrollment limit – 7
Azure tenant device limit – 20
The first option seems to be creating a script that clears out stale devices from our Azure tenant. When I’ve spoken with our Infrastructure team about device removal in the past, they said we’re using Entra Connect to sync with on-prem AD, so they were against the idea. I’ve found a way to convince them otherwise, but it’s going to take time and scripting.
The next option is using a device enrollment manager account, but the Microsoft documentation mentions it enrolls the device in shared mode and that device limits won’t work on devices enrolled this way. It also says “Do not delete accounts assigned as a Device enrollment manager if any devices were enrolled using the account. Doing so will lead to issues with these devices.” but doesn’t elaborate further. So, this option seems like a dead end.
Third option is to increase the device enrollment quota in Azure, but since this is a tenant wide setting, we don’t necessarily want to give Rick in accounting the ability to enroll as many devices as he can carry.
I found a comment in this thread that suggested using Remove-AzureADDeviceRegisteredOwner (now Remove-MgDeviceRegisteredOwnerByRef with the Graph modules). But this just changes the primary user. Doing so didn’t stop me from getting the error message.
So here are my questions –
If you’ve gone through this, how did you resolve the issue?
What exactly are the consequences of using a DEM account to enroll devices?
If I look at the devices attached to my user account, and filter by Autopilot devices, I have 42. Other offices have a single desktop person, and they have > 80 devices. What device property, in which directory, causes this error?
Do you have a stale device script you’d recommend? I’ll write my own, for sure, but having something to go off of would be nice (a rough sketch follows below).
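A rough sketch of the kind of stale-device query people often start from, assuming the Microsoft.Graph PowerShell module is installed; it only lists candidates and is a starting point, not a vetted cleanup script.
# Requires: Install-Module Microsoft.Graph
Connect-MgGraph -Scopes "Device.Read.All"

# List devices with no sign-in activity in the last 90 days (review before deleting anything)
$cutoff = (Get-Date).AddDays(-90)
Get-MgDevice -All |
    Where-Object { $_.ApproximateLastSignInDateTime -lt $cutoff } |
    Select-Object DisplayName, DeviceId, ApproximateLastSignInDateTime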
MS Form multiple images
I created a simple feedback questionnaire using MS Forms. The image I selected not only shows in the background, but is also shown as a duplicate on the front image. It doesn’t matter what Customized layout I use.
Is it possible to have 2 different images on an MS Form? One in the background and a different one in the front?
Partner Case Study Series | Dynamica Google Maps Integration brings Google Maps to Dynamics
Adaptability meets quality in Dynamica Labs
Dynamica Labs is a Microsoft Gold Partner whose team has pursued customer relationship management (CRM) development for more than 14 years. Its goal is to deliver the highest quality at an affordable rate. It accomplishes that using a hybrid delivery model: The company is based in London, UK, with a nearshore development center in Eastern Europe. Its expertise includes industrial and pharmaceutical distribution, professional services, commercial real estate, high-tech companies, and IT support services.
The company’s Dynamica Google Maps Integration solution, available on Microsoft AppSource, is a records location tool that is ideal for real estate and distribution companies.
“Google Maps is the leading geospatial database, providing a lot of tools and information for businesses,” said Igor Sarov, CEO at Dynamica Labs. “The Dynamica Google Maps Integration solution combines the power of the Microsoft Dynamics platform and the largest geodatabase. This makes it a perfect tool for companies that do daily route planning, and companies that want to quickly assess a building or location.”
Continue reading here
Explore all case studies or submit your own
Upload file to OneDrive folder via PHP / which product do I / my customer need?
Hi,
I’m totally new to this topic.
I’m about to develop an app (backend: PHP, frontend: JS) which should be able to push files (generated in PHP on my server) to a OneDrive folder of my customer. I am stuck on which product I (for developing) and later my customer need for that.
I think I need the “client credentials flow”. Which MS product/account is needed for that?
Is a OneDrive account enough?
Do I/my customer need an additional Azure account to register the app?
Thanks for help
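For context, here is a rough Python sketch of what the client credentials flow plus a Microsoft Graph upload can look like once an app registration exists. The tenant ID, client ID, secret, user, and file path are placeholders, and the app registration would need an application permission such as Files.ReadWrite.All with admin consent; this illustrates the flow rather than being a drop-in solution for a PHP backend.
import msal
import requests

TENANT_ID = "<tenant-id>"          # placeholder: the customer's Entra tenant
CLIENT_ID = "<app-client-id>"      # placeholder: app registration in that tenant
CLIENT_SECRET = "<client-secret>"  # placeholder: secret created for the app

# Client credentials flow: the app authenticates as itself, no user sign-in
app = msal.ConfidentialClientApplication(
    CLIENT_ID,
    authority=f"https://login.microsoftonline.com/{TENANT_ID}",
    client_credential=CLIENT_SECRET,
)
token = app.acquire_token_for_client(scopes=["https://graph.microsoft.com/.default"])

# Upload a small file into the target user's OneDrive via Microsoft Graph
upload_url = "https://graph.microsoft.com/v1.0/users/<user-id>/drive/root:/Reports/report.pdf:/content"
with open("report.pdf", "rb") as f:
    resp = requests.put(
        upload_url,
        headers={"Authorization": f"Bearer {token['access_token']}"},
        data=f,
    )
resp.raise_for_status()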
Mail merge
Does anyone here have experience with merging Excel data into Word documents? I had one of those days with Office I would sooner forget. I have an XLM file – not too heavy, but with OnCalculate VBA – which used to supply data into Word without any problems when we were still using Dropbox and a pre-365 environment. Since moving to O365 and SPO, Word will happily take 15 minutes over opening the document.
There is some weird trick by which I can force Word to give up on DDE and switch to something called OLEDB. It brings the time down to 1 minute – still pretty rubbish. I talked to my new friend ChatGPT about it, and I now know that I am in trouble because, after initially coming up with many helpful suggestions, it in the end referred me to Microsoft Support … 😞
One of its suggestions was to rebuild the Word doc from scratch. That wasn’t a big deal because the doc in question is a one-pager. A bit of CTRL-C/CTRL-V got the job done in no time. It also suggested that I move the Excel off SPO. I saved a copy into Downloads before making a new connection from the new doc. But here is the weirdest thing: when the OLEDB connections eventually resolved – we are talking here no less than 5 mins – the fields in Word did not contain the values from the open Excel. The values that displayed in the Recipient list were values from the file on SPO. Not that Word would actually populate the preview mailings.
Does this ring any bells with anyone? Thanks.
Ajio Invite Code? KAR1CY40K (Claim Exclusive Now)
Ajio Invite code is KAR1CY40K; using this invite code you can claim an exclusive bonus & extra discount on your purchase in Ajio. You can also use this bonus when shopping for any product from Ajio. Also share your invite code with your friends to earn up to Rs.2000. Ajio is one of India’s fastest-growing shopping platforms offering lightning-fast product delivery and an original quality experience to its users.
What is Ajio Invite Code?
KAR1CY40K is the Ajio app invite code. By applying the invite code you will get a signup bonus of up to Rs.1500. You can earn up to Rs.2000 by sharing your invite code with your friends. Ajio offers Rs.250 extra off on purchases from the store using the KAR1CY40K code.
Ajio Invite Code 2024
App Name: Ajio
Ajio Invite Code: KAR1CY40K
Sign Up Rewards: Exclusive Bonus
Per Invite: Rs.2000
Cashback: Rs.1550
Per App Content Filter on iOS
I am testing the Per App Content Filter (iOS 16 onwards) feature for iOS. Per App Content Filter entitlements can run on a managed device only, hence these entitlements must be pushed through MDM.
Apple documentation:
https://developer.apple.com/documentation/technotes/tn3134-network-extension-provider-deployment?language=objc
https://developer.apple.com/documentation/networkextension/content_filter_providers?language=objc
So far, my research concluded that Intune does not support it the way it supports per-app VPN.
Then I tried pushing the content filter profile as a custom profile and ContentFilterUUID as an app configuration policy by targeting it to the 3rd party app. The content filter gets pushed, but it does not get mapped to the 3rd party app. So it does not run and remains in an invalid state until the mapping is correct.
Can anyone help me with how I can achieve this in Intune?
Side Note: JAMF provides this built in, like per-app VPN, and I can see the payload (from iOS system logs) looks like the one below:
NESMFilterSession[Content Filter 16 May 2024:5F0ABFF4-5414-40D4-AD95-AE207D890720]: handling configuration changed: {
name = <26-char-str>
identifier = 5F0ABFF4-5414-40D4-AD95-AE207D890720
externalIdentifier = <36-char-str>
application = com.test.ent.app
grade = 1
contentFilter = {
enabled = YES
provider = {
pluginType = com.test.ent.app
organization = <7-char-str>
filterBrowsers = NO
filterPackets = NO
filterSockets = YES
disableDefaultDrop = NO
preserveExistingConnections = NO
}
filter-grade = 1
per-app = {
appRules = (
{
matchSigningIdentifier = org.mozilla.ios.Firefox
noDivertDNS = NO
},
)
excludedDomains = ()
}
}
payloadInfo = {
payloadUUID = FC494E29-90AE-4C56-B57A-2E501A17553A
payloadOrganization = <13-char-str>
profileUUID = C2074E3F-39F1-4A48-B979-FE13C0FBC779
profileIdentifier = <36-char-str>
isSetAside = NO
profileIngestionDate = 2024-08-16 21:30:23 +0000
systemVersion = Version 17.5.1 (Build 21F90)
profileSource = mdm
}
}
Using a function in COUNTIF criteria
Is it possible to use a function inside COUNTIF criteria so that it is applied to each cell in the range under test? For instance, I have column A of text values and want to count cells in it with the text longer than 8. I’m trying to write something like =COUNTIF(A:A,LEN(A1)&">8"), but it doesn’t work. What is a correct syntax if possible?
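COUNTIF criteria can’t apply a function such as LEN to the cells being tested, so a different function has to do the per-cell test. A minimal sketch, assuming the values sit in A1:A100 rather than the full column:
=SUMPRODUCT(--(LEN(A1:A100)>8))
In Excel 365 the same idea can be written as =SUM(--(LEN(A1:A100)>8)) and entered normally; blank cells have a length of 0, so they are not counted.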
Brand Center Options Grayed Out (Already have Organizational Assets Library)
I’m trying to activate the Brand Center in Org Settings, but everything is grayed out. I’m a global admin and we already use an Organizational Assets Library, which it detects. According to the documentation, it’s supposed to prompt to enable a public CDN and then we’re good to go, but that isn’t happening.
Anyone run into this? Is there some kind of prereq that hasn’t been met maybe?
CALCULATING 24 HOUR TIME TO THE NEAREST QUARTER HOUR
I bill clients and pay contractors to the nearest quarter hour. I cannot seem to put together a formula that will do the job.
Say start time is 0800 hrs and stop time is 1215 hrs. The formula (B-A)/100 returns 4.15 and not the 4.25 needed to calculate pay at the appropriate rate.
But if start time is 0730 hrs and stop time is 1715 hrs, the formula (B-A)/100 returns 9.85 and not the 9.75 needed.
Is there a step function I can incorporate? Or do I have to create a table and use a lookup function? Perhaps something simple I haven't thought of?
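One possible approach, assuming the times are entered as plain numbers such as 0800 and 1215 (in A1 and B1 here, as placeholders): convert each HHMM value to decimal hours before subtracting, then round to the nearest quarter hour with MROUND.
=MROUND((INT(B1/100)+MOD(B1,100)/60)-(INT(A1/100)+MOD(A1,100)/60), 0.25)
For example, 1215 becomes 12.25 and 0800 becomes 8, so the difference is 4.25 hours; 1715 minus 0730 works out to 9.75.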
Announcing Global Provisioned Managed Deployments for Scaling Azure OpenAI Service Workloads
We’re excited to announce a major advancement in AI deployments with Azure OpenAI Service: Global Provisioned Managed Deployments, now Generally Available (GA) as of September 18, 2024. This launch marks a significant milestone in our commitment to making AI more accessible, scalable, and flexible for customers worldwide, building on our August release of Provisioned Throughput Units (PTU) for self-service regional deployments.
What is Global Provisioned Managed?
Global Provisioned Managed is a new deployment type within the Azure OpenAI Service that leverages Azure’s global infrastructure to serve provisioned traffic more efficiently. It supports the latest GPT-4o (2024-08-06) and GPT-4o-mini (2024-07-18) models, making them accessible to customers without the limitations of region-specific quotas or capacities. This new deployment model empowers customers to extend AI capabilities to any corner of the globe, providing greater flexibility and speed in deploying models.
Dual Availability: Global and Regional
We are also pleased to announce that the GPT-4o (2024-08-06) model is now available not only through Global Provisioned Managed deployments but also for Provisioned Regional Deployments via self-service. This means customers have the flexibility to choose between a globally managed deployment model or a more controlled, region-specific deployment approach, depending on their specific needs and preferences.
Key Benefits of Global Provisioned Managed Deployments
Access to the Latest Models Everywhere: The Global Provisioned Managed deployment model removes regional limitations, allowing customers to access the newest AI models like GPT-4o and GPT-4o-mini across all supported Azure regions, including eastus, westeurope, japaneast, and more.
Simplified Deployment and Management: Unlike traditional deployment approaches, Global Provisioned Managed decouples capacity management from specific regions, granting automatic access to the new global quota for all eligible customers.
Data Residency and Compliance Flexibility: While API traffic may be processed globally, all customer data is securely stored in the Azure OpenAI Service resource’s region, ensuring adherence to regional data residency and compliance requirements.
Transparent and Flexible Pricing: Billing for Global Provisioned Managed follows the same model as existing Provisioned Managed deployments, ensuring predictable costs with options for hourly pricing and reservations to accommodate diverse usage scenarios.
Dual Deployment Options for Greater Flexibility: The availability of the GPT-4o model for both Global Provisioned Managed and Provisioned Regional Deployments gives customers the freedom to choose the most suitable deployment strategy for their organizational needs.
Why Choose Global Provisioned Managed?
This new deployment type represents a significant evolution in our approach to AI, offering:
Global Reach: Deploy AI models anywhere without the constraints of regional quotas or capacities.
Cost Efficiency: Benefit from cost management options, including monthly and yearly reservations.
Enhanced Flexibility: Deploy and scale AI solutions faster with less complexity and administrative burden, allowing you to focus more on innovation.
Regional Control: For customers needing specific regional deployments, the GPT-4o model remains available through self-service, enabling full control over capacity management.
How to Get Started
Deploying your AI models globally or regionally is simple:
For Global Provisioned Managed Deployments: This option will be available in your Azure OpenAI Service regional resources starting September 18, 2024. To use it, create or select an existing regional resource, and choose the Global Provisioned Managed deployment option (a CLI sketch of this step follows after this list).
For Provisioned Regional Deployments: The GPT-4o (2024-08-06) model is available for self-service regional deployments, giving you flexibility to manage regional capacities and resources according to your needs.
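As an illustration of the deployment step, here is a rough Azure CLI sketch. The resource group, resource name, deployment name, and capacity are placeholders, and the GlobalProvisionedManaged SKU name is an assumption based on the deployment type described above, so check the Azure OpenAI documentation for the exact values before using it.
az cognitiveservices account deployment create \
  --resource-group <resource-group> \
  --name <aoai-resource-name> \
  --deployment-name gpt-4o-global-provisioned \
  --model-name gpt-4o \
  --model-version 2024-08-06 \
  --model-format OpenAI \
  --sku-name GlobalProvisionedManaged \
  --sku-capacity 50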
Looking Ahead: More Models and Regions
Our initial rollout of Global Provisioned Managed includes support for the GPT-4o and GPT-4o-mini models, with plans to expand the availability of more models under this deployment type. For those requiring specific regional support, the existing Provisioned Managed deployment remains available.
Embrace the Future of AI with Azure OpenAI Service
Azure OpenAI Service is committed to pushing the boundaries of AI capabilities. With the new Global Provisioned Managed deployments, we’re breaking down barriers, providing more flexibility, and ensuring our customers can fully leverage AI’s potential anywhere in the world.
Learn More:
Azure Pricing Provisioned Reservations
Azure OpenAI Service Pricing
More details about Provisioned Throughput
Documentation for Onboarding
PTU Calculator in Azure AI Studio
Unveiling Azure OpenAI Service Provisioned Reservations blog
Data, Privacy, and Security for Azure OpenAI Service
Creating an Outlook distribution list, my first, LOST all the addresses in Outlook Contacts
Created a list of addresses using instructions from MS. Added, then saved. Now all I get for both the People and Groups icons is 1 address and the group. I had MANY addresses in the People folder. What happened? The group does hold the ones I selected for the group.
How to Free Up Storage in SharePoint Online Using Intelligent Versioning and PowerShell
So I’ve gotten more and more questions from users and clients about how to free up storage in their environment, and I didn’t see any good post about how to do it, so I wrote a blog post 🙂
Introduction
Freeing Storage in SharePoint Online Using Intelligent Versioning and PowerShell Automation
As organizations increasingly adopt SharePoint Online for collaboration and document management, managing storage efficiently becomes a critical challenge. One significant contributor to storage consumption is SharePoint’s versioning feature, which creates multiple versions of documents, leading to storage bloat over time.
In this post, we’ll discuss how to intelligently manage file versioning in SharePoint Online and automate the cleanup of old versions using PowerShell. By the end, you’ll have a clear strategy for maintaining healthy storage and managing SharePoint’s powerful versioning features effectively.
Understanding SharePoint Online Versioning
Versioning in SharePoint Online allows users to save multiple copies of documents as they get revised. This is essential for tracking changes, restoring earlier versions, and enabling collaboration. However, if not managed properly, the version history of documents can accumulate rapidly, consuming valuable storage space.
By default, SharePoint retains all versions of a document. Over time, this can lead to excessive storage usage, increased costs, and slower performance. This is where Intelligent Versioning becomes a useful strategy.
What is Intelligent Versioning?
Intelligent Versioning refers to a smart strategy that helps balance retaining valuable document history while freeing up unnecessary storage. Rather than keeping all document versions indefinitely, Intelligent Versioning helps ensure that only recent and important versions are retained, while older versions are deleted to conserve space.
This approach helps you optimize your SharePoint storage while preserving important data.
How Intelligent Versioning Works
Intelligent Versioning introduces a feature called automatic version thinning, which gradually trims older document versions while preserving key timestamps. Here’s how it works:
First 30 Days: All versions are kept.
From 30 to 180 Days: Hourly or daily versions are stored.
Beyond 180 Days: Weekly versions are kept indefinitely.
By using this automatic thinning approach, Microsoft reports that you can reduce version storage usage by up to 96% in some cases, without compromising users’ ability to access important historical versions.
In my personal experience, I’ve seen storage savings of 30-60% for various clients!
Benefits of Intelligent Versioning
Optimize Storage Costs: By deleting outdated versions, you reduce storage consumption, keeping within limits and potentially lowering costs.
Improve Performance: With fewer document versions stored, searches and retrieval times are faster.
Maintain Data Integrity: Intelligent Versioning ensures you delete only unnecessary versions while keeping critical ones intact.
How to Enable Intelligent Versioning
To enable Intelligent Versioning across your SharePoint Online tenant, use the following PowerShell command:
# Connect to SharePoint Online
$adminSiteUrl = "https://<tenant>-admin.sharepoint.com"
Connect-SPOService -Url $adminSiteUrl

# Enable Intelligent Versioning
Set-SPOTenant -EnableVersionExpirationSetting $true
PnP PowerShell:
# Install and Import PnP PowerShell module if not installed
# Install-Module -Name "PnP.PowerShell"

# Connect to SharePoint Online Admin Center
$adminSiteUrl = "https://<tenant>-admin.sharepoint.com"
Connect-PnPOnline -Url $adminSiteUrl -Interactive

# Enable Intelligent Versioning
Set-PnPTenant -EnableVersionExpirationSetting $true
Once enabled, this will apply Intelligent Versioning to all new files in SharePoint document libraries. You can verify this setting in the Admin Center under Versioning Settings.
Automating Version Cleanup Using PowerShell
While Intelligent Versioning applies to new documents, older files require manual intervention. Managing versions manually across multiple sites can be a tedious task. Fortunately, PowerShell allows you to automate the process.
Using the New-SPOSiteFileVersionBatchDeleteJob cmdlet, you can create batch deletion jobs to remove document versions older than a specified date.
Prerequisites:
SharePoint Online Administrator permissions.
SharePoint Online Management Shell installed.
PowerShell Script: Automating Version Cleanup
Here’s a PowerShell script that retrieves all site collections in your SharePoint Online tenant and creates a batch delete job for document versions older than 365 days:
# Connect to SharePoint Online
$adminSiteUrl = "https://<tenant>-admin.sharepoint.com"
Connect-SPOService -Url $adminSiteUrl

# Get all site collections
$siteCollections = Get-SPOSite -Limit All

# Loop through each site collection
foreach ($site in $siteCollections) {
    Write-Host "Processing site: $($site.Url)"

    # Create a batch delete job for versions older than 365 days
    try {
        New-SPOSiteFileVersionBatchDeleteJob -Identity $site.Url -DeleteBeforeDays 365 -Confirm:$false
        Write-Host "Batch delete job created for site: $($site.Url)"
    }
    catch {
        Write-Host "Error creating batch delete job for site: $($site.Url)"
        Write-Host $_.Exception.Message
    }
}

# Disconnect from SharePoint Online
Disconnect-SPOService
PnP PowerShell:
# Install the PnP PowerShell module if not already installed
# Install-Module -Name "PnP.PowerShell"

# Connect to SharePoint Online
$adminSiteUrl = "https://<tenant>-admin.sharepoint.com"
Connect-PnPOnline -Url $adminSiteUrl -Interactive

# Get all site collections
$siteCollections = Get-PnPTenantSite -IncludeOneDriveSites -Limit All

# Loop through each site collection
foreach ($site in $siteCollections) {
    Write-Host "Processing site: $($site.Url)"

    try {
        # Create a batch delete job for versions older than 365 days
        New-PnPSiteFileVersionBatchDeleteJob -SiteUrl $site.Url -LastRetainedVersionDate 365 -Confirm:$false
        Write-Host "Batch delete job created for site: $($site.Url)"
    }
    catch {
        Write-Host "Error creating batch delete job for site: $($site.Url)"
        Write-Host $_.Exception.Message
    }
}

# Disconnect from SharePoint Online
Disconnect-PnPOnline
How This Script Works
Connect to SharePoint Online: The script first connects to your SharePoint Online tenant using the Admin Center URL.
Retrieve All Site Collections: It retrieves all site collections using the Get-SPOSite cmdlet.
Set Retention Date: The script calculates the retention date as 365 days before the current date.
Create Batch Delete Job: The New-SPOSiteFileVersionBatchDeleteJob cmdlet creates a batch job to delete versions older than the number of days you’ve set.
Error Handling: If an error occurs, it logs the error and continues to the next site collection.
Best Practices for Intelligent Versioning and Cleanup
Define a Versioning Policy: Decide how many versions you want to retain for different types of documents. This can be configured in library settings or using PowerShell (e.g., -MajorVersionLimit X). I recommend leaving it on automatic!
Automate Version Cleanup: Use automation to regularly clean up old versions, minimizing the need for manual intervention.
Monitor Storage Usage: Regularly review storage reports from the SharePoint Admin Center to identify areas where storage is being consumed excessively. Make use of SharePoint Archive and Backup for data that isn’t needed on a regular basis.
Combine with Retention Policies: Protect critical files by applying Retention Labels or Retention Policies to ensure they are not deleted prematurely, especially for legal or compliance reasons.
Conclusion
As your organization grows, efficiently managing your SharePoint Online storage is crucial. Intelligent Versioning helps you retain the benefits of version control while optimizing storage and keeping costs under control.
By using PowerShell to automate version cleanup, you can streamline the process, ensuring that unnecessary versions are deleted without impacting important data. This way you can clean up old data and know that your future data isn’t being eaten up!
https://yourmodernworkplace.com/blog/Free-Storage-In-SharePoint-Online-Using-Intelligent-Versioning
So I’ve gotten more and more questions from users and client how to free up storage in their enviroment and I didn’t see and good post about how to do it, so I wrote a blogppost 🙂
Introduction
Freeing Storage in SharePoint Online Using Intelligent Versioning and PowerShell Automation
As organizations increasingly adopt SharePoint Online for collaboration and document management, managing storage efficiently becomes a critical challenge. One significant contributor to storage consumption is SharePoint’s versioning feature, which creates multiple versions of documents, leading to storage bloat over time.
In this post, we’ll discuss how to intelligently manage file versioning in SharePoint Online and automate the cleanup of old versions using PowerShell. By the end, you’ll have a clear strategy for maintaining healthy storage and managing SharePoint’s powerful versioning features effectively.
Understanding SharePoint Online Versioning
Versioning in SharePoint Online allows users to save multiple copies of documents as they get revised. This is essential for tracking changes, restoring earlier versions, and enabling collaboration. However, if not managed properly, the version history of documents can accumulate rapidly, consuming valuable storage space.
By default, SharePoint retains all versions of a document. Over time, this can lead to excessive storage usage, increased costs, and slower performance. This is where Intelligent Versioning becomes a useful strategy.
What is Intelligent Versioning?
Intelligent Versioning refers to a smart strategy that helps balance retaining valuable document history while freeing up unnecessary storage. Rather than keeping all document versions indefinitely, Intelligent Versioning helps ensure that only recent and important versions are retained, while older versions are deleted to conserve space.
This approach helps you optimize your SharePoint storage while preserving important data.
How Intelligent Versioning Works
Intelligent Versioning introduces a feature called automatic version thinning, which gradually trims older document versions while preserving key timestamps. Here’s how it works:
First 30 Days: All versions are kept.
From 30 to 180 Days: Hourly or daily versions are stored.
Beyond 180 Days: Weekly versions are kept indefinitely.
By using this automatic thinning approach, Microsoft reports that you can reduce version storage usage by up to 96% in some cases, without compromising users’ ability to access important historical versions.
My personal experience, I’ve seen storage saved 30-60% from various clients!
Benefits of Intelligent Versioning
Optimize Storage Costs: By deleting outdated versions, you reduce storage consumption, keeping within limits and potentially lowering costs.
Improve Performance: With fewer document versions stored, searches and retrieval times are faster.
Maintain Data Integrity: Intelligent Versioning ensures you delete only unnecessary versions while keeping critical ones intact.
How to Enable Intelligent Versioning
To enable Intelligent Versioning across your SharePoint Online tenant, use the following PowerShell command:
# Connect to SharePoint Online
$adminSiteUrl = “https://<tenant>-admin.sharepoint.com”
Connect-SPOService -Url $adminSiteUrl
# Enable Intelligent versioning
Set-SPOTenant -EnableVersionExpirationSetting $true
PnP Powershell:
# Install and Import PnP PowerShell module if not installed
# Install-Module -Name “PnP.PowerShell”
# Connect to SharePoint Online Admin Center
$adminSiteUrl = “https://<tenant>-admin.sharepoint.com”
Connect-PnPOnline -Url $adminSiteUrl -Interactive
# Enable Intelligent Versioning
Set-PnPTenant -EnableVersionExpirationSetting $true
Once enabled, this will apply Intelligent Versioning to all new files in SharePoint document libraries. You can verify this setting in the Admin Center under Versioning Settings.
Automating Version Cleanup Using PowerShell
While Intelligent Versioning applies to new documents, older files require manual intervention. Managing versions manually across multiple sites can be a tedious task. Fortunately, PowerShell allows you to automate the process.
Using the New-SPOSiteFileVersionBatchDeleteJob cmdlet, you can create batch deletion jobs to remove document versions older than a specified date.
Prerequisites:
SharePoint Online Administrator permissions.
SharePoint Online Management Shell installed.
PowerShell Script: Automating Version Cleanup
Here’s a PowerShell script that retrieves all site collections in your SharePoint Online tenant and creates a batch delete job for document versions older than 365 days:
# Connect to SharePoint Online
$adminSiteUrl = “https://<tenant>-admin.sharepoint.com”
Connect-SPOService -Url $adminSiteUrl
# Get all site collections
$siteCollections = Get-SPOSite -Limit All
# Loop through each site collection
foreach ($site in $siteCollections) {
Write-Host “Processing site: $($site.Url)”
# Create a batch delete job for versions older than 365 days
try {
New-SPOSiteFileVersionBatchDeleteJob -Identity $site.Url -DeleteBeforeDays 365 -confirm:$false
Write-Host “Batch delete job created for site: $($site.Url)”
}
catch {
Write-Host “Error creating batch delete job for site: $($site.Url)”
Write-Host $_.Exception.Message
}
}
# Disconnect from SharePoint Online
Disconnect-SPOService
PnP PowerShell:
# Install the PnP PowerShell module if not already installed
# Install-Module -Name "PnP.PowerShell"

# Connect to SharePoint Online
$adminSiteUrl = "https://<tenant>-admin.sharepoint.com"
Connect-PnPOnline -Url $adminSiteUrl -Interactive

# Get all site collections
$siteCollections = Get-PnPTenantSite -IncludeOneDriveSites

# Calculate the retention cutoff date (365 days before today)
$retentionDate = (Get-Date).AddDays(-365)

# Loop through each site collection
foreach ($site in $siteCollections) {
    Write-Host "Processing site: $($site.Url)"
    try {
        # Create a batch delete job for versions older than the cutoff date
        New-PnPSiteFileVersionBatchDeleteJob -SiteUrl $site.Url -LastRetainedVersionDate $retentionDate -Confirm:$false
        Write-Host "Batch delete job created for site: $($site.Url)"
    }
    catch {
        Write-Host "Error creating batch delete job for site: $($site.Url)"
        Write-Host $_.Exception.Message
    }
}

# Disconnect from SharePoint Online
Disconnect-PnPOnline
How This Script Works
Connect to SharePoint Online: The script first connects to your SharePoint Online tenant using the Admin Center URL.
Retrieve All Site Collections: It retrieves all site collections using the Get-SPOSite cmdlet (or Get-PnPTenantSite in the PnP variant).
Set Retention Period: Versions older than 365 days are targeted; the SPO cmdlet takes the number of days directly, while the PnP script converts it into a cutoff date.
Create Batch Delete Job: The New-SPOSiteFileVersionBatchDeleteJob cmdlet creates a batch job to delete versions older than the retention period you’ve set.
Error Handling: If an error occurs, it logs the error and continues to the next site collection.
Best Practices for Intelligent Versioning and Cleanup
Define a Versioning Policy: Decide how many versions you want to retain for different types of documents. This can be configured in library settings or using PowerShell (e.g., -MajorVersionLimit X); I recommend leaving it on the automatic setting. A per-library example is sketched just after this list.
Automate Version Cleanup: Use automation to regularly clean up old versions, minimizing the need for manual intervention.
Monitor Storage Usage: Regularly review storage reports from the SharePoint Admin Center to identify areas where storage is being consumed excessively. Make use of SharePoint Archive and Backup for data that isn’t needed on a regular basis.
Combine with Retention Policies: Protect critical files by applying Retention Labels or Retention Policies to ensure they are not deleted prematurely, especially for legal or compliance reasons.
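To make the versioning-policy and storage-monitoring points above more concrete, here is a minimal PowerShell sketch. The library name ("Documents"), the version limit of 100, and the top-20 report size are illustrative assumptions, not values prescribed by this article:
# Per-library versioning policy with PnP PowerShell (assumes an existing Connect-PnPOnline session to the site)
Set-PnPList -Identity "Documents" -EnableVersioning $true -MajorVersions 100

# Quick storage report: the 20 largest site collections (StorageUsageCurrent is reported in MB)
Connect-SPOService -Url "https://<tenant>-admin.sharepoint.com"
Get-SPOSite -Limit All |
    Sort-Object StorageUsageCurrent -Descending |
    Select-Object Url, StorageUsageCurrent -First 20
The tenant-wide automatic setting shown earlier remains the simplest default; explicit per-library limits are mainly useful for libraries with unusually heavy edit activity.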
Conclusion
As your organization grows, efficiently managing your SharePoint Online storage is crucial. Intelligent Versioning helps you retain the benefits of version control while optimizing storage and keeping costs under control.
By using PowerShell to automate version cleanup, you can streamline the process, ensuring that unnecessary versions are deleted without impacting important data. This way you can clean up old data and know that your future storage isn’t being eaten up!
https://yourmodernworkplace.com/blog/Free-Storage-In-SharePoint-Online-Using-Intelligent-Versioning
How do I write this formula?
Hi!
I’ve created a workbook with 2 worksheets. They’re named “MW-562 Pay” and “Employee Hours”.
When the user fills in the “MW-562 Pay” worksheet, certain bits of information are pulled from it into the “Employee Hours” worksheet.
The formula I’ve used, in the cells on the “Employee Hours” worksheet, to do this is
='MW-562 Pay'!A14
and I change the A14 from cell to cell as needed. It works great.
Where I’m running into trouble:
Some users will need to add a second “MW-562 Pay” worksheet (and a third, and a fourth, etc.) to accommodate more employees.
To do that, they’ll copy the “MW-562 Pay” worksheet into the workbook. They’ll be instructed to name those additional worksheets as “MW-562 Pay 2” and “MW-562 Pay 3” and “MW-562 Pay 4” and so on.
I need to pre-populate the “Employee Hours” worksheet ahead of time with formulas so that it’s ready to pull the info from the additional “MW-562 Pay” worksheets once they’re added, if they’re added (some users won’t need to add additional worksheets, the one will be enough).
The cells on the “Employee Hours” worksheet that will pull info from the added worksheets need to remain visibly empty until and unless the user adds more “MW-562 Pay” worksheets.
Because I know what the name of the additional worksheets will be, I thought that I could write the formulas ahead of time to pull the info from those yet-to-be added worksheets, for example:
='MW-562 Pay 2'!A14
And then when someone copies the “MW-562 Pay” and names that added worksheet “MW-562 Pay 2”, the formulas I pre-populated into the “Employee Hours” worksheet would start pulling the needed info from the newly added “MW-562 Pay 2” worksheet.
Unfortunately, I ended up with #REF errors on the “Employee Hours” worksheet in all cells that have formulas pulling info from the yet to be added “MW-562 Pay 2” worksheet.
I understand why I received the #REF errors (because the formulas are referencing a worksheet that isn’t present yet) but I thought, no big, that’ll change once I add the “MW-562 Pay 2” worksheet.
Alas, that didn’t happen, the #REF errors remained.
How would I go about accomplishing my goal which is to have the “Employee Hours” worksheet pre-populated with the formulas that’ll pull the info from the additional worksheets once they’re added? Is it possible?
Thank you so much!! 🙂
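One workaround that may be worth trying here (a sketch, not part of the original question): build the cross-sheet reference with INDIRECT, which resolves the sheet name as text at calculation time, and wrap it in IFERROR so the cell stays visibly empty until the sheet exists. Using the example sheet and cell from the post:
=IFERROR(INDIRECT("'MW-562 Pay 2'!A14"),"")
Once a sheet named “MW-562 Pay 2” is added, INDIRECT starts returning the value of its cell A14; until then, IFERROR suppresses the #REF! error and shows an empty string. The trade-off is that INDIRECT is volatile, so it recalculates more often than a direct reference.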
Quick search problem in Outlook 2021
Hello everyone
I have a question about searching in Outlook.
I am connected to an Exchange server.
When I do a search, it only returns messages from about the last 3 days (just the most recent results).
To make it search further back, I need to click “include older results”.
How can I make it search a whole month at once, without clicking “include older results”? Or can it just perform a full search (including older results) right away?
TLS 1.0 and TLS 1.1
At the OS level I have TLS 1.0 and TLS 1.1 turned off. Could they be configured under IIS per site?
How can I find the individual sites making outbound connections?
IIS 10 on Windows Server 2022
Thanks
Account locked, and the verification method doesn’t work now
Hi,
I’m writing on behalf of my girlfriend, as her account cannot be recovered.
She reset her password and has not been able to sign in since.
She is trying to reset the password again, but when she tries to complete the 2FA verification, the only option available is receiving an SMS. The website keeps telling her that this verification method is not working right now, and she cannot do anything else.
Can you please help me with that? I can give you more info if needed.