Category Archives: Microsoft
Struggling to modify JSON formatting
Good Morning
I’m using this beautiful Gantt formatting from GitHub, but I’m struggling to modify it. To be clear, I’m trying to create a view similar to an Excel sheet because people can’t cope with too much change at a time!
This is what it looks like.
I want to change two things and neither works. That’s a reflection of my JSON knowledge more than anything else.
I want to remove the inline edit function. And I don’t want to show this at all when another field has a value in it.
This is the code:
{
  "$schema": "https://developer.microsoft.com/json-schemas/sp/v2/column-formatting.schema.json",
  "elmType": "div",
  "children": [
    {
      "elmType": "div",
      "style": {
        "display": "flex",
        "flex-direction": "row",
        "width": "100%",
        "align-items": "center"
      },
      "children": [
        {
          "elmType": "div",
          "style": {
            "display": "flex",
            "flex-direction": "column",
            "width": "100%",
            "margin-bottom": "3px"
          },
          "children": [
            {
              "elmType": "div",
              "style": {
                "display": "flex",
                "justify-content": "space-between",
                "width": "100%",
                "white-space": "nowrap"
              },
              "attributes": {
                "class": "ms-fontSize-xs"
              },
              "children": [
                {
                  "elmType": "div",
                  "style": {
                    "display": "flex",
                    "flex-direction": "row-reverse"
                  },
                  "children": [
                    {
                      "elmType": "div",
                      "txtContent": "=if(loopIndex('_startdate')>1,'','/')+[$_startdate]",
                      "forEach": "_startdate in split(substring(@currentField,indexOf(@currentField,'(')+1,indexOf(@currentField,'_')),'-')"
                    }
                  ]
                },
                {
                  "elmType": "div",
                  "style": {
                    "display": "flex",
                    "flex-direction": "row-reverse"
                  },
                  "children": [
                    {
                      "elmType": "div",
                      "txtContent": "=if(loopIndex('_enddate')>1,'','/')+[$_enddate]",
                      "forEach": "_enddate in split(substring(@currentField,indexOf(@currentField,'_')+1,indexOf(@currentField,')')),'-')"
                    }
                  ]
                }
              ]
            },
            {
              "elmType": "div",
              "style": {
                "width": "100%",
                "border": "1px solid",
                "height": "13px",
                "position": "relative",
                "overflow": "hidden"
              },
              "children": [
                {
                  "elmType": "div",
                  "style": {
                    "position": "absolute",
                    "height": "100%",
                    "width": "=((Number(Date(substring(@currentField,indexOf(@currentField,'^')+1,indexOf(@currentField,'('))))-Number(Date(substring(@currentField,0,indexOf(@currentField,'^'))))+86400000)/(Number(Date(substring(@currentField,indexOf(@currentField,'_')+1,indexOf(@currentField,')'))))-Number(Date(substring(@currentField,indexOf(@currentField,'(')+1,indexOf(@currentField,'_'))))+86400000))*100+'%'",
                    "left": "=((Number(Date(substring(@currentField,0,indexOf(@currentField,'^'))))-Number(Date(substring(@currentField,indexOf(@currentField,'(')+1,indexOf(@currentField,'_')))))/(Number(Date(substring(@currentField,indexOf(@currentField,'_')+1,indexOf(@currentField,')'))))-Number(Date(substring(@currentField,indexOf(@currentField,'(')+1,indexOf(@currentField,'_'))))+86400000))*100+'%'"
                  },
                  "attributes": {
                    "class": "ms-bgColor-themePrimary",
                    "title": "=[$Start.displayValue]+' ~ '+[$End.displayValue]"
                  }
                },
                {
                  "elmType": "div",
                  "style": {
                    "position": "absolute",
                    "height": "100%",
                    "width": "=(86400000/(Number(Date(substring(@currentField,indexOf(@currentField,'_')+1,indexOf(@currentField,')'))))-Number(Date(substring(@currentField,indexOf(@currentField,'(')+1,indexOf(@currentField,'_'))))+86400000))*100+'%'",
                    "left": "=((Number(@now)-(60*60*1000*12)-Number(Date(substring(@currentField,indexOf(@currentField,'(')+1,indexOf(@currentField,'_')))))/(Number(Date(substring(@currentField,indexOf(@currentField,'_')+1,indexOf(@currentField,')'))))-Number(Date(substring(@currentField,indexOf(@currentField,'(')+1,indexOf(@currentField,'_'))))+86400000))*100+'%'",
                    "display": "flex",
                    "justify-content": "center",
                    "align-items": "center"
                  },
                  "children": [
                    {
                      "elmType": "div",
                      "style": {
                        "width": "100%",
                        "min-width": "5px",
                        "height": "100%"
                      },
                      "attributes": {
                        "class": "ms-bgColor-sharedRed10",
                        "title": "Now"
                      }
                    }
                  ]
                }
              ]
            }
          ]
        },
        {
          "elmType": "div",
          "style": {
            "padding": "10px",
            "margin-left": "5px",
            "cursor": "pointer",
            "border-radius": "50%"
          },
          "attributes": {
            "iconName": "Edit",
            "class": "ms-bgColor-themeLighter--hover"
          },
          "customCardProps": {
            "openOnEvent": "click",
            "directionalHint": "topCenter",
            "isBeakVisible": true,
            "formatter": {
              "elmType": "div",
              "style": {
                "display": "flex",
                "flex-direction": "column",
                "padding": "10px 15px"
              },
              "children": [
                {
                  "elmType": "div",
                  "style": {
                    "display": "flex",
                    "flex-direction": "row",
                    "align-items": "center"
                  },
                  "children": [
                    {
                      "elmType": "div",
                      "inlineEditField": "[$Start]",
                      "style": {
                        "border": "1px solid",
                        "padding": "5px 10px",
                        "border-radius": "3px",
                        "display": "flex",
                        "align-items": "center",
                        "margin": "5px"
                      },
                      "children": [
                        {
                          "elmType": "span",
                          "txtContent": "[$Start.displayValue]"
                        },
                        {
                          "elmType": "span",
                          "style": {
                            "margin-left": "8px"
                          },
                          "attributes": {
                            "iconName": "Edit"
                          }
                        }
                      ]
                    },
                    {
                      "elmType": "div",
                      "txtContent": "~",
                      "style": {
                        "margin": "5px"
                      }
                    },
                    {
                      "elmType": "div",
                      "inlineEditField": "[$End]",
                      "style": {
                        "border": "1px solid",
                        "padding": "5px 10px",
                        "border-radius": "3px",
                        "display": "flex",
                        "align-items": "center",
                        "margin": "5px"
                      },
                      "children": [
                        {
                          "elmType": "span",
                          "txtContent": "[$End.displayValue]"
                        },
                        {
                          "elmType": "span",
                          "style": {
                            "margin-left": "8px"
                          },
                          "attributes": {
                            "iconName": "Edit"
                          }
                        }
                      ]
                    }
                  ]
                },
                {
                  "elmType": "div",
                  "txtContent": "If you changed the date manually instead of using the calendar, press [Enter] after changing the date.",
                  "style": {
                    "max-width": "300px",
                    "text-align": "center"
                  },
                  "attributes": {
                    "class": "ms-fontSize-s"
                  }
                }
              ]
            }
          }
        }
      ]
    }
  ]
}
I would appreciate any suggestions. Whatever I have taken out so far hasn’t changed a thing.
Thanks,
Christine
Manual Sleep option not working
This Win11 option used to work fine. Now it just returns you directly to the login screen. Any ideas on how to get this working again? Many thanks.
Troubleshooting – Survey does not send out
Hi All,
This is likely in the wrong spot, and I’m suggesting making a troubleshooting community specifically for Glint. I’m so curious as to how there would not be more resources when Viva Glint requires either a manual import or SFTP. Did everyone else have a much easier go? At any rate, I think I have the Viva Glint system sorted; however, my surveys are simply not sending out. When I go through all the steps and run an export of who it was sent to, the downloaded file has no users
Announcing: General Availability of Azure API Center extension for Visual Studio Code
About Azure API Center extension for Visual Studio Code
The Azure API Center extension for Visual Studio Code enables developers to build, discover, try, and consume APIs in your API center:
Build APIs – Make APIs you’re building discoverable to others by registering them in your API center. Shift-left API design conformance checks into Visual Studio Code with integrated linting support. Ensure that new API versions don’t break API consumers with breaking change detection.
Discover APIs – Browse the APIs in your API center, and view their details and documentation.
Try APIs – Use Swagger UI or REST client to explore API requests and responses.
Consume APIs – Generate API SDK clients for your favorite language including JavaScript, TypeScript, .NET, Python, and Java, using the Microsoft Kiota engine that generates SDKs for Microsoft Graph, GitHub, and more.
Register APIs
Register an API in your API center directly from Visual Studio Code, either by registering it as a one-time operation or with a CI/CD pipeline.
API design conformance
To ensure design conformance with organizational standards as you build APIs, the Azure API Center extension for Visual Studio Code provides integrated support for API specification linting with Spectral.
Breaking change detection
When introducing new versions of your API, it’s important to ensure that changes introduced do not break API consumers on previous versions of your API. The Azure API Center extension for Visual Studio Code makes this easy with breaking change detection for OpenAPI specification documents powered by Optic.
Discover APIs
Your API center resources appear in the tree view on the left-hand side. Expand an API center resource to see APIs, versions, definitions, environments, and deployments.
View API documentation
You can view the documentation for an API definition in your API center and try API operations. This feature is only available for OpenAPI-based APIs in your API center.
Generate HTTP file
You can view a .http file based on the API definition in your API center. If the REST Client extension is installed, you can make requests directly from the Visual Studio Code editor. This feature is only available for OpenAPI-based APIs in your API center.
Generate API client
Use the Microsoft Kiota extension to generate an API client for your favorite language. This feature is only available for OpenAPI-based APIs in your API center.
Export API specification
You can export an API specification from a definition and then download it as a file.
Get started with Azure API Center extension for Visual Studio Code now!
Microsoft Tech Community – Latest Blogs –Read More
Document default font not working
I noticed setting the control to set a default font is now vaporware – it does not work. Any idea when MS will restore this function?
ZylonkOS:For all
Dear Microsoft,
I am Matthew, the owner of Zylonkcompany, creator of ZylonkOS, a new operating system featuring a robust kernel and advanced AI functionalities. We are enthusiastic about the current technological revolution, particularly in AI, and its potential for shaping the future of PCs. We are keen to introduce ZylonkOS and explore potential opportunities for collaboration or feedback from your specialized team. We believe our innovations can complement the current technological ecosystem, enhancing user experiences and advancing AI capabilities in operating systems.
Thank you for your attention. We are available to provide further information or arrange a discussion at your convenience.
Best regards,
Matthew
Owner, Zylonkcompany
Email: email address removed for privacy reasons
Commercial and Support Contact
Validate CSV files before ingestion in Microsoft Data Factory Pipelines
A very common task for Microsoft Fabric, Azure Data Factory and Synapse Analytics Pipelines is to receive unstructured files, land them in an Azure Data Lake (ADLS Gen2) and load them into structured tables. This often leads to a very common issue with unstructured files when “SOMETHING HAS CHANGED” and the unstructured file does not meet the defined table format. If issues are not handled properly within the pipeline, the data workloads will fail and users will be asking “WHERE’S MY DATA???” You then need to communicate with the owner of the file, have them fix the issues, then rerun the pipeline after the issues have been fixed. Along with unhappy users, rerunning failed pipelines adds cost. Validating these files before they are processed allows your pipelines to continue ingesting files that do have the correct format. For pipelines that do fail, your code or process can pinpoint what caused the error, leading to faster resolution of the issue. In this blog, we’ll walk through a Microsoft Fabric Data Factory Pipeline that validates incoming CSV files for common errors before loading to a Microsoft Fabric Lakehouse delta table.
Overview
The source files in this process are in an Azure Data Lake storage account, which has a shortcut in the Fabric Lakehouse. A data pipeline calls a Spark notebook to check the file for any new or missing columns, any data that is invalid for the expected data type, or any duplicate key values. If the file has no errors, the pipeline loads the CSV data into a parquet file and then calls another Spark notebook to load the parquet file into a delta table in the Lakehouse. Otherwise, if there are errors in the file, the pipeline sends a notification email.
Source files and metadata files
In this solution, files are landing in an ADLS Gen 2 container folder called scenario1-validatecsv which has a shortcut to it in the Fabric Lakehouse. The files folder contains the files to process; the metadata folder contains a file describing the format each CSV file type should conform to.
This solution is to load to a table called customer, which has columns number, name, city and state. In the format definition file, customer_meta, there’s a row for each customer table column, providing the column name, the column data type, and whether or not it is a key value. This metadata file is later used in a Spark notebook to validate that the incoming file conforms to this format.
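To make the format definition concrete, here is a sketch of what customer_meta.csv might contain and how the key fields fall out of it. The column names (formatname, columname, datatype, iskeyfield) come from the notebook code later in the post; the sample rows themselves are illustrative assumptions.

```python
import io
import pandas as pd

# Hypothetical contents of the customer_meta.csv format definition file:
# one row per customer table column, with its data type and key flag.
meta_csv = io.StringIO(
    "formatname,columname,datatype,iskeyfield\n"
    "customer,number,int64,1\n"
    "customer,name,object,0\n"
    "customer,city,object,0\n"
    "customer,state,object,0\n"
)
meta = pd.read_csv(meta_csv)

# Filter to one file format and pull out its key fields, as the notebook does
meta = meta.loc[meta["formatname"] == "customer"]
keyfields = meta.loc[meta["iskeyfield"] == 1, "columname"].tolist()
print(keyfields)  # ['number']
```

An incoming CSV is then compared against the columname and datatype values, and keyfields drives the duplicate-key check.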
Orchestrator Pipeline
The orchestrator pipeline is very simple – since I am running my pipeline as a scheduled batch, it loops through the files folder and invokes another pipeline for each file. Note the parametrization of the lakehouse path, the source folders, the destination folder and the file format. This allows the same process to be run for any lakehouse and for any file format/table to load.
For Each activity
When invoking the child pipeline from the For Each activity, it passes in the parameter values from the orchestrator pipeline plus the name of the current file being processed and the metadata file name, which is the file format name with ‘_meta’ appended to it.
Child pipeline – Validate and load pipeline
The Validate and load pipeline validates the current CSV file, and if the file conforms to the format, loads it into a parquet table then merges the parquet data into a delta table.
1 – Parameters
Parameters passed in from the orchestrator pipeline for the current CSV file to process
2 – Set variable activity – Set parquet file name
Removes .csv to define the parquet file name
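The logic of this Set variable step, expressed in Python rather than the actual pipeline expression (a sketch, not the expression itself):

```python
# Derive the parquet file name by dropping only a trailing ".csv"
# from the incoming file name; other names pass through unchanged.
def parquet_name(filename: str) -> str:
    if filename.lower().endswith(".csv"):
        return filename[:-4]
    return filename

print(parquet_name("customer_good.csv"))  # customer_good
```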
3 – Notebook activity – Validate CSV and load file to parquet
Calls the notebook, passing in parameter and variable values
Below is the PySpark code for the notebook. It gets the column names and inferred data types from the CSV file, as well as the column names, data types and key field names from the metadata file. It checks whether the column names match, whether the data types match, whether there are keys defined for the file, and finally whether there are any duplicate key values in the incoming file. If there are duplicate key values, it writes them to a file. If the names and data types match and there are no duplicate key values, it writes the file to parquet and passes back the key field names from the metadata file; otherwise, it returns the appropriate error message.
# Files/landingzone/files parameters
lakehousepath = 'abfss://xxxxx@msit-onelake.dfs.fabric.microsoft.com/xxxxx'
filename = 'customer_good.csv'
outputfilename = 'customer_good'
metadatafilename = 'customer_meta.csv'
filefolder = 'scenario1-validatecsv/landingzone/files'
metadatafolder = 'scenario1-validatecsv/landingzone/metadata'
outputfolder = 'scenario1-validatecsv/bronze'
fileformat = 'customer'

# Import pandas and pyarrow
import pandas as pd
import pyarrow as pa
import pyarrow.parquet as pq

# Set path variables
inputfilepath = f'{lakehousepath}/Files/{filefolder}/'
metadatapath = f'{lakehousepath}/Files/{metadatafolder}/'
outputpath = f'{lakehousepath}/Files/{outputfolder}/'

# Read the text file and the metadata file
print(f'{inputfilepath}{filename}')
data = pd.read_csv(f'{inputfilepath}{filename}')
meta = pd.read_csv(f'{metadatapath}{metadatafilename}')

# Only keep the column metadata for the file format type that was input
meta = meta.loc[meta['formatname'] == fileformat]
print(data.dtypes)
print(list(meta['columname']))

# Get any key fields specified
keyfields = meta.loc[meta['iskeyfield'] == 1, 'columname'].tolist()
print(keyfields)

# Check for errors in CSV
haserror = 0

# Check if the column names match
if list(data.columns) != list(meta['columname']):
    # Issue an error
    result = 'Error: Column names do not match.'
    haserror = 1
else:
    # Check if the datatypes match
    if list(data.dtypes) != list(meta['datatype']):
        # Issue an error
        result = 'Error: Datatypes do not match.'
        haserror = 1
    else:
        # If the file has key fields, check if there are any duplicate keys;
        # if there are duplicate keys, also write the duplicate key values to a file
        if keyfields:
            checkdups = data.groupby(keyfields).size().reset_index(name='count')
            print(checkdups)
            if checkdups['count'].max() > 1:
                dups = checkdups[checkdups['count'] > 1]
                print(dups)
                haserror = 1
                dups.to_csv(f'{lakehousepath}/Files/processed/error_duplicate_key_values/duplicaterecords_{filename}',
                            mode='w', index=False)
                result = 'Error: Duplicate key values'

if haserror == 0:
    # Write the data to parquet if no errors
    df = spark.read.csv(f'{inputfilepath}{filename}', header=True, inferSchema=True)
    print(f'File is: {inputfilepath}{filename}')
    display(df)
    df.write.mode('overwrite').format('parquet').save(f'{outputpath}{outputfilename}')
    result = f'Data written to parquet successfully. Key fields are: {keyfields}'

mssparkutils.notebook.exit(str(result))
4 – Copy data activity – Move File to processed folder
This Copy Data activity essentially moves the csv file from the ADLS Gen 2 files folder to a processed folder in the Fabric Lakehouse
Destination folder name is derived from the Notebook exit value, which returns success or the error message
5 – If condition activity: If File Validated
Check if the CSV file was successfully validated and loaded to parquet
5a – File validated successfully
If there were no errors in the file, call the spark notebook to merge the parquet file written from the previous pyspark notebook to the delta table.
Parameters for the lakehouse path, the parquet file path and name, the table name, and the key fields are passed in. As shown above, the key fields were derived from the previous PySpark notebook and are passed into the Create or Merge to Table notebook.
Below is the spark notebook code. If the delta table already exists and there are key fields, it builds a string expression to be used on the pyspark merge statement and then performs the merge on the delta table. If there are no key fields or the table does not exist, it writes or overwrites the delta table.
# create or merge to delta
# input parameters below
lakehousepath = 'abfss://xxxe@yyyy.dfs.fabric.microsoft.com/xxx'
inputfolder = 'scenario1-validatecsv/bronze'
filename = 'customergood'
tablename = 'customer'
keyfields = "['number']"

# define paths
outputpath = f'{lakehousepath}/Tables/{tablename}'
inputpath = f'{lakehousepath}/Files/{inputfolder}/{filename}'

# import delta table and sql functions
from delta.tables import *
from pyspark.sql.functions import *

# get list of key values
keylist = eval(keyfields)
print(keylist)

# read input parquet file
df2 = spark.read.parquet(inputpath)
# display(df2)

# if there are key fields defined for the table, build the merge key expression
mergeKeyExpr = None
if keylist:
    mergeKeyExpr = ' AND '.join(f't.{key} = s.{key}' for key in keylist)
    print(mergeKeyExpr)

# if the table exists and should be upserted as indicated by the merge key, do an upsert
# and return how many rows were inserted and updated; if it does not exist or is a full
# load, overwrite the existing table and return how many rows were inserted
if DeltaTable.isDeltaTable(spark, outputpath) and mergeKeyExpr is not None:
    deltaTable = DeltaTable.forPath(spark, outputpath)
    deltaTable.alias('t').merge(
        df2.alias('s'),
        mergeKeyExpr
    ).whenMatchedUpdateAll().whenNotMatchedInsertAll().execute()
    history = deltaTable.history(1).select('operationMetrics')
    operationMetrics = history.collect()[0]['operationMetrics']
    numInserted = operationMetrics['numTargetRowsInserted']
    numUpdated = operationMetrics['numTargetRowsUpdated']
else:
    df2.write.format('delta').mode('overwrite').save(outputpath)
    numInserted = df2.count()
    numUpdated = 0

print(numInserted)
result = 'numInserted=' + str(numInserted) + '|numUpdated=' + str(numUpdated)
mssparkutils.notebook.exit(str(result))
5b – File validation failed
If there was an error in the CSV file, send an email notification
Summary
Building resilient and efficient data pipelines is critical no matter your ETL tool or data sources. Thinking ahead to what types of problems can, and inevitably will, occur and incorporating data validation into your pipelines will save you a lot of headaches when those pipelines are moved into production. The examples in this blog are just a few of the most common errors with CSV files. Get ahead of those data issues and resolve them without last-minute fixes and disrupting other processes! You can easily enhance the methods in this blog by including other validations or by validating other unstructured file types like JSON. You can change the pipeline to run as soon as the unstructured file is loaded into ADLS rather than in batch. Using techniques like this to reduce hard errors gives your pipelines (and yourself!) more credibility!
Azure Linux Partner Showcase: Isovalent
This month’s partner showcase is with our partner Isovalent which is now a part of Cisco. We are highlighting how Cilium, Azure Linux and Azure Kubernetes Service bring value to our mutual customers. The Azure Linux team is excited to have Isovalent as a partner!
From Isovalent:
“Azure Kubernetes Service (AKS) uses Cilium natively, wherein AKS combines the robust control plane of Azure CNI with Cilium’s data plane to provide high-performance networking and security. Isovalent Cilium Enterprise is an enterprise-grade, hardened distribution of open-source projects Cilium, Hubble, and Tetragon, built and supported by the Cilium creators.”
The already expansive tutorial from Isovalent in the blog link below has now been updated to highlight how users can perform OS SKU migration (preview).
Please see the Isovalent blog for more information – Cilium, Azure Linux, and Azure Kubernetes Service come together. – Isovalent
Find Isovalent Container Offer for Kubernetes on the Azure Marketplace – https://azuremarketplace.microsoft.com/marketplace/apps/isovalentinc1662143158090.isovalent-cilium-enterprise?tab=Overview
Find Isovalent on the AKS Partner Solutions page – https://learn.microsoft.com/en-us/azure/aks/azure-linux-aks-partner-solutions#isovalent
Email Distribution List in Outlook for External Users
Hi,
Is there a way to create a distribution list in Outlook 365 for a group of external users e.g. my clients?
The important requirement is that they should not be able to see each others’ email addresses, nor should they be able to email each other by clicking reply or reply all. So, it’s kind of an old fashioned distribution list but only I can send emails to members and if anyone replies to the email, it should come back to me.
I checked the “Contact Group” in Outlook but that displays each member’s name/email so that wouldn’t work.
First Fundraiser!
Hello all!
I’m Celia, CEO of the African Future Pledge 501c3 Organization.
I’m neck-deep in our first www.AfricanFuturePledge.com fundraising project and about to send out donation letters to raise funds for prizes for our young contestants. I hope this platform will support our mission and our fundraising projects. We are definitely rookies, so to speak, but we are also very determined to support our young generations to the fullest. It’s nice to be among you and I’m looking to learn all I can that will support our efforts.
Secure Time Seeding (STS) strikes again – disable STS before it happens to you
Had an incident this week where time jumped forward by several months for no apparent reason on a domain controller. This has all the hallmarks of the STS issue written up here:
https://arstechnica.com/security/2023/08/windows-feature-that-resets-system-clocks-based-on-random-data-is-wreaking-havoc/3/
We are planning to disable STS on all servers before this happens again. If you haven’t heard about this issue, read the article above.
Enhanced Collaboration in Custom Environment
Does anyone know if the enhanced collaboration is available in a custom environment?
Future of Project Accelerator
I recently deployed Project Accelerator and have been using it in production. I am concerned over the future of the platform. Does anyone have any insight on the future of this platform?
Copilot for Microsoft 365 limitations with document size
Hey,
I’m trying to understand what the limitations on file size within Copilot for M365 are.
For example, when I ask it to summarize a very large file, I’m not sure what the size limit is in terms of number of words or MB.
Another example: when I create a Word document or a PPT document from a Word file or a PDF file, I’m not sure what the size limit is for the referenced document.
Any clue, any links, any data?
Calculate Years Past
Hello All – I’m looking to convert many rows within a table to a “years left” calculation instead of doing it all by hand. The result should be:
Year New – 2020
5 Years before Expiration
Years remaining – 1
Year New 2018
5 Years before Expiration
Years Remaining – (-1)
I have all my expiration years in a separate column, but I’m trying to come up with a simple formula that calculates the years remaining and presents it simply as 5 years left or -1 years left. Any advice would be appreciated, thank you
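The arithmetic in the examples above (expiration = “year new” + 5, remaining = expiration − current year) can be sketched in Python; a spreadsheet formula would follow the same shape, e.g. something like `=YearNew+5-YEAR(TODAY())`. The 5-year term and a current year of 2024 are assumptions taken from the post’s examples:

```python
# Years remaining until expiration, assuming a fixed 5-year term
# starting from the "year new" value (as in the examples above).
def years_remaining(year_new: int, current_year: int, term: int = 5) -> int:
    expiration = year_new + term
    return expiration - current_year

# With current_year = 2024, matching the post's examples:
print(years_remaining(2020, 2024))  # 1  (expires 2025)
print(years_remaining(2018, 2024))  # -1 (expired 2023)
```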
New Bit.get referral code: qp29 (new registration 2024)
The best B I T GE T referral code for 2024 is “qp29”. Use this code to get a 30% discount on trades. In addition, new users who register for B I T GE T using the promo code “qp29” can receive an exclusive reward of up to 5005 USDT.
Benefits of the B I T GE T referral code qp29
The B I T GE T referral code qp29 offers a great way to save on trading fees while earning attractive rewards. By entering this code, you get a permanent 30% discount on your trading fees. In addition, if you share your personal referral code with your friends, you can receive a 50% bonus on their trading fees. Take advantage of this opportunity to grow your income while bringing new users to the platform.
The best B I T GE T referral code for 2024
The recommended B I T GE T referral code for 2024 is qp29. When you register with this code, you can receive up to 5005 USDT as a bonus. Share this code with your friends to earn a 50% commission, which helps you secure a registration bonus of up to 5005 USDT. This is a great way to enhance your trading experience with additional benefits and encourage others to join.
How to use the B I T GE T referral code
The B I T GE T referral code is intended specifically for new users who have not yet registered on the platform. Use the code as follows:
Visit the B I T GE T website and click “Sign in”.
Enter your user details and complete the KYC and AML procedures.
When prompted for a referral code, enter qp29.
Complete the registration process and carry out the required verifications.
Once all conditions are met, you will receive your welcome bonus immediately.
Why use the B I T GE T referral code?
Permanent discount: with the code qp29, you automatically receive a 30% discount on all trading commissions.
Generous welcome bonus: new users can receive up to 5005 USDT.
Additional income: share your code and earn a 50% commission.
Take advantage of this opportunity and secure your benefits with the current B I T GE T referral code qp29! Receive up to 5005 USDT and enjoy permanent discounts on your trading fees.
Deleted Users
Hi all,
We have E5 Compliance licenses. I’ve been asked to set a retention policy of five years just for current employees. If I use a static scope of all users for the retention policy, what happens when a user leaves?
So after the deleted user is soft deleted and then hard deleted, are their associated retained emails also deleted? I know we can use inactive mailboxes and legal holds if we want to keep the email, but I’m just wondering what happens if we don’t.
New Bit.get referral code: qp29 (new registration 2024)
The best B I T GE T referral code for 2024 is “qp29”. Use this code to get a 30% discount on trades. In addition, new users who register for B I T GE T with the promo code “qp29” can receive an exclusive reward of up to 5005 USDT.
Benefits of the B I T GE T referral code qp29
The B I T GE T referral code qp29 offers a great opportunity to save on trading fees while earning attractive rewards. By entering the code, you receive a permanent 30% discount on your trading fees. In addition, if you share your personal referral code with your friends, you can receive a 50% bonus on trading fees. Take advantage of this opportunity to increase your income while bringing new users to the platform.
The best B I T GE T referral code for 2024
The recommended B I T GE T referral code for 2024 is qp29. If you register with this code, you can receive up to 5005 USDT as a bonus. Share this code with your friends to earn a 50% commission, securing a registration bonus of up to 5005 USDT. This is a great way to improve your trading experience with extra benefits while encouraging others to participate.
How to use the B I T GE T referral code
The B I T GE T referral code is intended specifically for new users who have not yet registered on the platform. To use the code, follow these steps:
Visit the B I T GE T website and click the “Sign in” button.
Enter your user details and complete the KYC and AML procedures.
When the system asks for the referral code, enter qp29.
Complete the registration process and carry out the necessary verifications.
Once all conditions are met, you will receive the welcome bonus immediately.
Why use the B I T GE T referral code?
Permanent discount: with the code qp29, you automatically receive a 30% discount on all trading commissions.
Generous welcome bonus: new users can receive up to 5005 USDT.
Additional income: share your code and earn a 50% commission.
Take advantage of this opportunity and secure your benefits with the current B I T GE T referral code qp29! Receive up to 5005 USDT and enjoy permanent discounts on your trading fees.
Announcing new Windows Autopilot onboarding experience for government and commercial customers
Organizations are increasingly adopting a hybrid workplace and Windows Autopilot provides flexibility to deliver devices to users anywhere with internet connectivity. With more and more adoption of Windows Autopilot, Microsoft Intune is enhancing this solution to support a greater variety of scenarios and use cases.
Today, Intune is releasing a new Autopilot profile experience, Windows Autopilot device preparation, which enables IT admins to deploy configurations efficiently and consistently and removes the complexity of troubleshooting for both commercial and government (Government Community Cloud (GCC) High, and U.S. Department of Defense (DoD)) organizations and agencies.
What is Windows Autopilot device preparation and why was it created?
While the existing Windows Autopilot experience supports multiple scenarios and device types, we’re extending this value across additional cloud instances and improving consistency and troubleshooting capabilities based on customer feedback. We’re introducing Autopilot device preparation in a way that won’t interrupt current deployments, while providing a more consistent and efficient experience.
Among some of the benefits of Autopilot device preparation are:
Availability in government clouds (GCC High and DoD) which will allow government customers to deploy at scale using Autopilot.
Providing more consistency in the user experience during deployments by locking in IT admins’ intentions for onboarding.
Creating more error resiliency in the experience to allow users to recover without needing to call a help desk.
Sharing more insight into the Autopilot process with new reporting details.
A single Autopilot device preparation profile to configure deployment and OOBE settings
The Autopilot device preparation admin experience simplifies admin configuration by having a single profile to provision all policies in one location, including deployment settings and out-of-box experience (OOBE) settings. It also improves the consistency of the experience for users and gets them to the desktop faster by allowing you to select which apps (line-of-business (LOB), Win32, or Store apps) and PowerShell scripts must be delivered during OOBE.
Grouping at enrollment time
An improved grouping experience places devices in a group at the time of enrollment. Simply assign all configurations to a device security group and include the group as part of the device preparation profile. The configuration will be saved and then delivered on the device as soon as the user authenticates during OOBE.
New user experience in OOBE
A simplified OOBE view shows the progress of the deployment as a percentage so that users know how far along in the process they are. When the device preparation configuration has been delivered to the device, the user will be informed that critical setup is complete, and they can continue to the desktop.
The Autopilot device preparation deployment report
The new Autopilot device preparation deployment report captures the status of each deployment in near real-time and provides detailed information to help with troubleshooting. Here are some highlights of what to expect:
Easily track which devices went through Autopilot
Track status and deployment phase in near-real-time
Expand more details for each deployment:
Device details
Profile name and version
Deployment status details
Apps applied with status
Scripts applied with status
Coming soon: Corporate identifiers for Windows
While we don’t have a tenant association feature ready in this initial release, we understand the importance of only allowing known devices to enroll to your tenant. So, we’ll soon expand enrollment restrictions to include Windows corporate identifiers. Autopilot device preparation will support the new corporate identifier enrollment feature <link to doc>. This added functionality will allow you to pre-upload device identifiers and ensure only trusted devices go through Autopilot device preparation. Stay tuned to What’s new in Intune for the release!
Frequently Asked Questions
How is this new Autopilot profile different from the current Autopilot profile?
The new Autopilot profile is a re-architecture of the current Autopilot profile, so while the experience for OEMs, IT admins, and users may look the same, the underlying architecture is very different. The updated architecture in the new Autopilot profiles gives the admin new capabilities that improve the deployment experience.
New orchestration agent allows the experience to fail fast and provide more error details.
Targeting is more precise and avoids dynamic changes when dynamic grouping is used.
Reporting infrastructure provides more details on the deployment experience.
Who does the new Autopilot profile benefit?
The new profile will benefit government customers who can now use Windows Autopilot device preparation to streamline their deployments at scale. It’ll also benefit new customers onboarding Windows Autopilot by reducing the complexity of setting up the deployment.
Is the new profile available in all sovereign clouds?
The new profile is available for Government Community Cloud (GCC) High and U.S. Department of Defense (DoD). It’s expected to be available for Intune operated by 21Vianet in China later this year.
What about the other Autopilot scenarios like pre-provisioning and self-deploying mode?
These functionalities will be supported in the future but aren’t part of the initial release.
Why is there a limit on the number of apps I can select to be delivered during OOBE?
We limited the number of applications that can be applied during OOBE to increase stability and achieve a higher success rate. Looking at our telemetry, almost 90% of all Autopilot deployments are deployed with 10 or fewer apps. This limit is intended to improve the overall user experience so that users can become more productive quickly. We understand that there are outliers and companies that want to target more during setup, but for the user-driven approach, we want to leverage the desktop experience for non-essential applications.
What is the order of installation for the device preparation profile?
The process is described in detail in: Overview for Windows Autopilot device preparation user-driven Microsoft Entra join in Intune.
Can we now mix app types such as LOB and Win32 apps with the device preparation profile?
While we always recommend Win32 apps, in current Autopilot deployments, mixing apps may result in errors. With the device preparation profile, we’ve streamlined the providers so different app types should not impact each other.
What is the guidance on user- vs device-based targeting?
Only device-based configurations will be delivered during OOBE. Assign security policy to devices, ensure all selected apps in the device preparation profile are set to install in system context, and are targeted to the device security group specified in the profile.
How will users know when the setup is complete?
Many users aren’t sure when the provisioning process is complete. To help mitigate confusion and calls to the help desk, we’re adding a completion page in OOBE. Admins can configure the page to require a user to manually select to continue, or set the page to auto-continue. This message will let the user know that OOBE setup is complete, but there may be additional installations happening that they can monitor in the Intune Company Portal.
Can the new profile be used by other MDMs?
Windows Autopilot device preparation will support 3rd party MDMs. In this initial release, configuration is only possible via Intune.
Will this be available on Windows 10 devices?
Currently, device preparation profiles are only available on:
Windows 11, version 23H2 with KB5035942 or later.
Windows 11, version 22H2 with KB5035942 or later.
How can I move my existing devices to the new device preparation profile?
If you’d like to have an existing device join your tenant through the device preparation profile, the device would first need to be de-registered from Autopilot, then retargeted to a security group within your device preparation profile.
Do I need to migrate my existing profiles from Autopilot-to-Autopilot device preparation?
There’s no need to migrate from existing Autopilot to the new Autopilot profile. We expect both environments to exist in parallel for a while as we work to improve the experience and add more functionality.
Does this mean we are no longer investing in Autopilot?
Not at all! We’re continuing to work on Autopilot in parallel with developing Autopilot device preparation. The first release of Autopilot device preparation won’t have all the scenarios of Autopilot, specifically pre-provisioning and self-deploying modes, so we’ll continue to invest in those areas. Additionally, where possible, we plan to add any high value features from Autopilot device preparation to Autopilot to improve the experience for all customers.
If you have any questions, leave a comment below or reach out to us on X @IntuneSuppTeam. Stay tuned to What’s new in Intune and What’s new in Autopilot as we continue developing this new deployment experience.
Microsoft Tech Community – Latest Blogs
Announcing: Public Preview of Resubmit from an Action in Logic Apps Consumption Workflows
Introduction
We are happy to introduce Resubmit from an Action in the Consumption SKU. This is a feature that customers have been requesting for a long time, and we are glad to finally deliver it to the Consumption SKU. Resubmit from an action has been part of the Standard SKU for a while, and we have received positive feedback from customers. Standard SKU announcement.
Background
Resubmit from the trigger has been available for many years; however, customers have been looking for more flexibility, namely the ability to resubmit from any action within the workflow. This gives customers more control over where they resume their workflow from and allows them to avoid duplicating data in action calls that were previously successful.
How it works
Once you select the action to be resubmitted, all actions before the selected action, including the trigger, are replayed from the original workflow run. This means we reuse the inputs and outputs of those actions rather than actually executing them. Once the workflow execution reaches the resubmitted action, we process that action and all following actions as normal.
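The replay behavior can be illustrated with a small sketch (the structures and names below are hypothetical, not the actual Logic Apps runtime): actions before the resubmit point return their recorded outputs from the original run, and only actions from the resubmit point onward actually execute.

```python
# Illustrative sketch of resubmit-from-action replay semantics.
# `actions` is an ordered list of (name, fn); `recorded` holds the
# original run's outputs. Hypothetical, not the Logic Apps runtime.
def resubmit(actions, recorded, resubmit_from):
    outputs = {}
    replaying = True
    for name, fn in actions:
        if name == resubmit_from:
            replaying = False
        if replaying:
            outputs[name] = recorded[name]  # reuse; do not execute
        else:
            outputs[name] = fn(outputs)     # execute as normal
    return outputs

# Example: resubmit from "send", reusing "fetch" and "parse" outputs.
recorded = {"fetch": "raw", "parse": "parsed"}
actions = [
    ("fetch", lambda o: "raw-NEW"),
    ("parse", lambda o: o["fetch"] + "-parsed"),
    ("send",  lambda o: "sent:" + o["parse"]),
]
result = resubmit(actions, recorded, "send")
print(result)  # {'fetch': 'raw', 'parse': 'parsed', 'send': 'sent:parsed'}
```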
How to use it
We have improved the visibility of this operation since its first Standard SKU release. We listened to the input from our partners and customers and realized that the feature was too obscure and not easy to find. Therefore, we have expanded the number of locations where users can start the Resubmit operation.
Go to your workflow’s Run History page and select the run you want to resubmit.
Find the action you want to resubmit. Note: Failed and Successful actions can be resubmitted. There are two ways to resubmit from an action:
Option A
Right-click on the action and click the Submit from this action button.
Option B
Click on the action to bring up the run details. Near the top of the newly opened pane find and click on the Submit from this action button.
The page will refresh, putting you into the context of the new run. Note: Actions occurring before the resubmitted action will have a dim-green icon indicating their inputs and outputs were reused from the parent run.
Limitations
The resubmit-from-action feature is not available for all actions and workflow configurations. Keep the following limitations in mind when using this feature:
The workflow must be a Stateful workflow
The resubmitted run will execute the same flow version as the original run. This is true even if the workflow definition has been updated.
The workflow must have 40 or fewer actions to be eligible for action resubmit.
The workflow must be in a completed state e.g. Failed, Successful, Cancelled
Only actions of sequential workflows are eligible to be resubmitted. Workflows with parallel paths are currently not supported.
Actions inside of a Foreach or Until operations are not eligible to be resubmitted. Additionally, the Foreach and Until operations themselves are not eligible.
Actions that occur after Foreach and Until operations are not eligible to be resubmitted.
This feature is currently not available in VS Code or the Azure CLI.
What’s next
Give it a try! This has been a highly requested feature, as there was previously no way to resubmit from an action inside a Consumption workflow. It alleviates the need to re-run an entire workflow because of an external service failure or misconfiguration. Please let us know your thoughts!