Category Archives: Microsoft
Almost everything you need to know to start with Microsoft Loop
Join the Microsoft Loop team tomorrow, Wednesday, August 21st at 10:00AM Pacific for the “Almost everything you need to know to start with Microsoft Loop” event.
During this event, the Loop experts will share what’s coming soon and how to get started with Microsoft Loop. They will highlight Loop components across various Microsoft 365 apps and share how to create and customize pages and workspaces within your favorite apps.
Slides from Episode 156 of Microsoft Cloud and Hosting Partner Monthly Online Meeting now available
Hi everyone, please find here the content from the Cloud and Hosting Partner Online Meeting held on Tuesday, August 20. In the session we covered highlights from various Commerce and Operations webinars held over the past 30 days, a discussion of questions raised at the last meeting and via the online form in a segment called “Menagerie”, some of the changes in content from past meetings, general news from the last month, and some of the recent and forthcoming Skilling opportunities.
Enjoy the content and to those that joined in person … thank you.
Regards, Phil
But it says here Windows 11 Client Insider Preview – Build 26100.1150 English, not Windows 11 Client.
And I need Windows 11 Client Insider Preview – Build 26100.1 English
Sum of a column based on name and date entered on another tab
I am working with two tabs: Billing and Totals.
On the Billing tab – Column B is the date, Column G is the name (Sue), and the number to be summed is in column F.
On the Totals tab, B3 is 1/1/2024, C3 is 2/1/2024, and so on, ending with 12/1/2024 in M3.
“Sue” appears in the static position of A4 on the totals tab.
I need a formula on the Totals tab that will look at the Billing tab and give me the sum of the numbers entered into column F, IF the name in G is “Sue”, AND the date in B is the specified month. The formula will be copied over for each month and for each name. (I can make the changes to point at the right month and name once I have the actual formula to work with).
Hopefully this makes sense to someone. It barely makes sense to me.
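Not knowing the exact layout beyond what is described, here is a minimal sketch: assuming the first sheet is literally named Billing, the month headers in B3:M3 are real Excel dates, and the formula goes in B4 of the Totals tab (then gets filled right and down):

=SUMIFS(Billing!$F:$F, Billing!$G:$G, $A4, Billing!$B:$B, ">="&B$3, Billing!$B:$B, "<"&EDATE(B$3,1))

The two date criteria bracket the month that starts in row 3, and the mixed references ($A4, B$3) let the same formula pick up the right name and month as it is copied.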
Volume Licensing Center is silently gone – welcome admin.microsoft.cloud
After a long transition, severely delayed by COVID-19, the old VLSC portal is now gone and has been replaced by admin.microsoft.cloud for most volume licensing customers.
View of the VLSC webpage – no longer giving you access to a Vista-like user experience
I came across this when I was looking to edit permissions on licensing contracts for a customer.
The new and vastly improved experience of VLSC is found in
admin.microsoft.cloud > Billing > Your Products
Example in admin.microsoft.cloud for permissions on contracts
These products include access to the following licensing programs: MPSA, EA, Open Value, Open, and Open Value Subscription, commonly known as “VL” or Volume Licensing.
Meanwhile, admin.microsoft.cloud > Billing > Licenses
Licenses, in contrast to Products, contains everything under the MCA (Microsoft Customer Agreement), more commonly known as CSP.
You will see your licenses in admin.microsoft.cloud, provided you completed the VLSC-to-Azure-portal transition in time (roughly 2022 through late 2023) and have the correct permissions in place.
The new permission editor is a breeze, as is the whole UX. Hats off, Microsoft!
Per customer reports, most of this change went unnoticed. While there was email communication at the start of the transition with a “to do” to marry VLSC to a business account (nothing less than Entra and Azure), the end of service itself was not communicated.
Microsoft Office Product Key Question
Hello,
I purchased a Home & Student Product Key with my Microsoft user account. Is it possible to use the key with a different user account? I bought it as a gift for someone else, but when this person tries to enter the key, the app requests a log-in with my user account. Is there a way for me to share or gift this license to a different user?
Unable to log in to Poly desktop phone.
A user reported being unable to log in to a Poly desktop phone and keeps getting prompted to enter their password repeatedly. The user says it is a new phone deployed earlier this year.
How would I distribute a value among cells?
I have a sheet I’m using to calculate the number of tasks throughout the week. It is distributed based on hours worked each day. Mon-Thurs: 8.5 hours, Fri: 6 hours.
Right now I’m using a simple ROUNDDOWN/ROUNDUP on the cells in the “Daily” row, but as you can see below, they don’t always round in a way that keeps the total at the end equal to the actual amount I’m starting the day with.
How would I go about making sure they round properly so that the “Tasks:” each day equals the “Total” at the end of the row of days?
Any other improvements you all can think of to improve this would be appreciated. I feel like I’m getting better at this but
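Since I can only guess at the layout, here is a minimal sketch assuming the daily hours sit in B2:F2, the weekly total of tasks in H2, and the “Daily” row in B4:F4; the trick is to round the running total rather than each day, so the rounding error never accumulates:

In B4 (first day): =ROUND($H$2*B2/SUM($B$2:$F$2),0)
In C4, filled right through F4: =ROUND($H$2*SUM($B$2:C2)/SUM($B$2:$F$2),0)-SUM($B4:B4)

Each day receives the rounded cumulative share minus what was already assigned to earlier days, so the row always adds back up to the total in H2.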
Introducing the MDTI Article Digest
The MDTI team is excited to introduce the MDTI Article Digest, a new way for customers to stay up to speed with the latest analysis of threat activity observed across more than 78 trillion daily threat signals from Microsoft’s interdisciplinary teams of experts worldwide. The digest, seamlessly integrated into the MDTI user interface in the threat intelligence blade of Defender XDR, shows users everything published since their last login:
Customers will see that the digest not only notifies users of the latest content but also encourages exploration through a user-friendly sidebar that lists the articles:
With the added convenience of pagination, users can now easily navigate through a wealth of information, ensuring they never miss valuable insights. The digest is also flexible, allowing users to clear notifications, thus tailoring the experience to their preferences.
The digest is a significant step forward in our commitment to delivering exceptional user experiences, and we’re excited to see how it will positively impact the MDTI community. If you’re a licensed MDTI user, log in to Defender XDR today to see the digest located on the right-hand side of the UI, to the left of the TI Copilot embedded experience sidebar.
Conclusion
Microsoft delivers leading threat intelligence built on visibility across the global threat landscape, made possible by protecting Azure and other large cloud environments, managing billions of endpoints and emails, and maintaining a continuously updated graph of the internet. By processing an astonishing 78 trillion security signals daily, Microsoft can deliver threat intelligence in MDTI that provides an all-encompassing view of attack vectors across various platforms, ensuring Sentinel customers have comprehensive threat detection and remediation.
If you are interested in learning more about MDTI, how it can help you unmask and neutralize modern adversaries and cyberthreats such as ransomware, and exploring its features and benefits, please visit the MDTI product web page.
Also, be sure to contact our sales team to request a demo or a quote. Learn how you can begin using MDTI with the purchase of just one Copilot for Security SCU here.
A better Phi-3 Family is coming – multi-language support, better vision, intelligence MOEs
After the release of Phi-3 at Microsoft Build 2024, it has received a lot of attention, especially for the application of Phi-3-mini and Phi-3-vision on edge devices. In the June update, we improved benchmarks and System role support by adjusting the high-quality training data. In the August update, based on community and customer feedback, we brought multi-language support to Phi-3-mini-128k-instruct, multi-frame image input to Phi-3-vision-128k, and added the new Phi-3 MOE for AI Agents. Next, let’s take a look.
Multi-language support
In previous versions, Phi-3-mini had good English corpus support but weak support for non-English languages. When we tried to ask questions in Chinese, the answers were often wrong, for example:
In the new version, we get much better understanding and corpus support thanks to the new Chinese prediction support:
Better vision
from PIL import Image  # load the extracted keyframes for multi-frame input

images = []
placeholder = ""
for i in range(1, 22):
    # append each keyframe image and build the matching <|image_i|> placeholder string
    images.append(Image.open("../output/keyframe_" + str(i) + ".jpg"))
    placeholder += f"<|image_{i}|>\n"
Intelligence MOEs
Faster pre-training speed than dense models
Faster inference speed than models with the same number of parameters
Requires a lot of GPU memory because all experts need to be loaded into memory
There are many challenges in fine-tuning, but recent research shows that instruction tuning for mixed expert models has great potential.
sys_msg = """You are a helpful AI assistant, you are an agent capable of using a variety of tools to answer a question. Here are a few of the tools available to you:

- Blog: This tool helps you describe a certain knowledge point and content, and finally write it into Twitter or Facebook style content
- Translate: This is a tool that helps you translate into any language, using plain language as required
- Final Answer: the final answer tool must be used to respond to the user. You must use this when you have decided on an answer.

To use these tools you must always respond in JSON format containing `"tool_name"` and `"input"` key-value pairs. For example, to answer the question, "Build Multi Agents with MOE models" you must use the Blog tool like so:

{
    "tool_name": "Blog",
    "input": "Build Multi Agents with MOE models"
}

Or to translate the question "can you introduce yourself in Chinese" you must respond:

{
    "tool_name": "Translate",
    "input": "can you introduce yourself in Chinese"
}

Remember to output just the final result, in JSON format containing `"agentid"`, `"tool_name"`, `"input"` and `"output"` key-value pairs:

[
    {
        "agentid": "step1",
        "tool_name": "Blog",
        "input": "Build Multi Agents with MOE models",
        "output": "………"
    },
    {
        "agentid": "step2",
        "tool_name": "Translate",
        "input": "can you introduce yourself in Chinese",
        "output": "………"
    },
    {
        "agentid": "final",
        "tool_name": "Result",
        "output": "………"
    }
]

The user's question is as follows.
"""
We can see that by telling the model what skills it needs to have and how the tasks are arranged, Phi-3 MOEs can assign the work to different agents to complete the related tasks.
[
    {
        "agentid": "step1",
        "tool_name": "Blog",
        "input": "Generative AI with MOE",
        "output": "Generative AI with MOE (Mixture of Experts) is a powerful approach that combines the strengths of generative models and the flexibility of MOE architecture. This hybrid model can generate high-quality, diverse, and contextually relevant content, making it suitable for various applications such as content creation, data augmentation, and more."
    },
    {
        "agentid": "step2",
        "tool_name": "Translate",
        "input": "Generative AI with MOE is a powerful approach that combines the strengths of generative models and the flexibility of MOE architecture. This hybrid model can generate high-quality, diverse, and contextually relevant content, making it suitable for various applications such as content creation, data augmentation, and more.",
        "output": "基于生成AI的MOE(Mixture of Experts)是一种强大的方法,它结合了生成模型的优势和MOE架构的灵活性。这种混合模型可以生成高质量、多样化且上下文相关的内容,使其适用于各种应用,如内容创建、数据增强等。"
    },
    {
        "agentid": "final",
        "tool_name": "Result",
        "output": "基于生成AI的MOE(Mixture of Experts)是一种强大的方法,它结合了生成模型的优势和MOE架构的灵活性。这种混合模型可以生成高质量、多样化且上下文相关的内容,使其适用于各种应用,如内容创建、数据增强等。"
    }
]
Thoughts on SLMs
Resources
Visualizing Data as Graphs with Fabric and KQL
Introduction
For quite a while, I have been extremely interested in data visualization. Over the last few years, I have focused on ways to visualize graph databases (regardless of where the data comes from). Using force-directed graphs to highlight the similarities or “connected communities” in data is incredibly powerful. The purpose of this post is to highlight the recent work the Kusto.Explorer team has done to visualize graphs in an Azure Data Explorer database, with data coming from a Fabric KQL Database.
Note: The Kusto.Explorer application used to visualize the graph is currently only supported on Windows.
Background
Azure Data Explorer (ADX) is Microsoft’s fully managed, high-performance analytics engine specializing in near real time queries on high volumes of data. It is extremely useful for log analytics, time-series and Internet of Things type scenarios. ADX is like traditional relational database models in that it organizes the data into tables with strongly typed schemas.
In September 2023, the ADX team introduced extensions to the query language (KQL) that enabled graph semantics on top of the tabular data. These extensions enabled users to contextualize their data and its relationships as a graph structure of nodes and edges. Graphs are often an easier way to present and query complex or networked relationships. These are normally difficult to query because they require recursive joins on standard tables. Examples of common graphs include social networks (friends of friends), product recommendations (similar users also bought product x), connected assets (assembly line) or a knowledge graph.
Fast forward to February 2024, Microsoft Fabric introduced Eventhouse as a workload in a Fabric workspace. This brings forward the power of KQL and Real-Time analytics to the Fabric ecosystem.
So now, I have a large amount of data in Fabric Eventhouse that I want to visualize with a force directed graph…
Let’s get started!
Pre-Requisites
If you want to follow along, you will need a Microsoft Fabric account (Get started with Fabric for Free).
Next, for this post, I used an open dataset from the Bureau of Transportation Statistics. The following files were used:
Aviation Support Tables – Master Coordinate data
When you download this file, you can choose the fields to be included in it. For this example, I only used AirportID, Airport, AirportName, AirportCityName and AirportStateCode.
This Airport data will be loaded directly to a table in KQL.
This file does not necessarily need to be unzipped.
Airline Service Quality Performance 234 (On-Time performance data)
For this blog, I only used the “April 2024” file from this link.
This data will be accessed using a Lakehouse shortcut.
Unzip this file to a local folder and change the extension from “.asc” to “.psv” because this is a pipe-separated file.
In order to use these downloaded files, I uploaded them to the “Files” section of the Lakehouse in my Fabric Workspace. If you do not have a Lakehouse in your workspace, first, navigate to your workspace and select “New” -> “More Options” and choose “Lakehouse” from the Data Engineering workloads. Give your new Lakehouse a name and click “Create”.
Once you have a Lakehouse, you can upload the files by clicking on the Lakehouse to bring up the Lakehouse Explorer. First, in the Lakehouse Explorer, click the three dots next to “Files” and select “New subfolder” and create a folder for “Flights”. Next, click the three dots next to the “Flights” sub-folder and select “Upload” from the drop-down menu and choose the on-time performance file. Confirm that the file is uploaded to files by refreshing the page.
Next, an Eventhouse will be used to host the KQL Cluster where you will ingest the data for analysis. If you do not have an Eventhouse in your workspace, select “New” -> “More Options” and choose “Eventhouse” from “Real-Time Intelligence” workloads. Give your new Eventhouse a name and click “Create”.
Finally, we will use the Kusto.Explorer application (available only for Windows) to visualize the graph. This is a one-click deployment application, so it is possible that it will run an application update when you start it up.
Ingest Data to KQL Database
When the Eventhouse was created, a default KQL database with the same name was created. To get data into the database, click the three dots next to the database name, select “Get Data” -> “Local File”. In the dialog box that pops up, in the “Select or create a destination table”, click the “New table” and give the table a name, in this case it will be “airports”. Once you have a valid table name, the dialog will update to drag or browse for the file to load.
Note: You can upload files in a compressed file format if it is smaller than 1GB.
Click “Next” to inspect the data for import. For the airports data, you will need to change the “Format” to CSV and enable the option for “First row is column header”.
Click “Finish” to load the file to the KQL table.
The airport data should now be loaded into the table, and you can query the table to view the results.
Here is a sample query to verify that the data was loaded:
airports
| take 100;
For the On-Time performance data, we will not ingest it into KQL. Instead, we will create a shortcut to the files in the Lakehouse storage.
Back in the KQL Database explorer, at the top, click on the “+ New -> OneLake shortcut” menu item.
In the dialog that comes up, choose “Microsoft OneLake” and in the “Select a data source type”, choose the Lakehouse where the data was uploaded earlier, and click “Next”
Once the tree view of the OneLake populates the Tables and Files, open the files, and select the subfolder that was created when uploading the On-Time data, and click “Create” to complete the shortcut creation.
Once the shortcut is created, you can view that data by clicking the “Explore your data” and running the following query to validate your data.
external_table('flights')
| count;
Note: When accessing the shortcut data, use the “external_table” and the name of the shortcut that was created. You cannot change the shortcut name.
Query and Visualize with Kusto.Explorer
Now that the data is connected to an Eventhouse database, we want to start to do analytics on this data. Fabric does have a way to run KQL Queries directly, but the expectation is that the results of the query will be a table. The only way to show the graph visualization is to use the Kusto.Explorer.
To connect to the KQL database, you need to get the URI of the cluster from Fabric. Navigating to the KQL Database in Fabric, there is a panel that includes the “Database details”.
Using the “Copy URI” to the right of the Query URI will copy the cluster URI to the clipboard.
In the Kusto.Explorer application, right click the “Connections” and select “Add Connection”
In the popup, paste the Query URI into the “Cluster connection” textbox, replacing the text that is there. You can optionally give the connection an alias rather than using the URI. Finally, I chose to use AAD for security; you can choose whatever is appropriate for your client access.
At this point, we can open a “New Tab” (Home menu) and type in a query like the one we used above:
let nodes = airports;
let edges = external_table('flights')
| project origin = Column7, dest = Column8, flight = strcat(Column1, Column2), carrier = Column1;
edges
| make-graph origin --> dest with nodes on AIRPORT
Note: You may need to modify the table names (airports, flights) depending on the shortcut or table name you used when loading the data. These values are case-sensitive.
The points of interest in our graph are the airports (nodes), and the connections (edges) are the individual flights that were delayed. I am using the “make-graph” operator in KQL to build a graph of edges from origin to destination, using the three-character airport code as the link.
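If you want only the delayed flights as edges, a filter can be added before the projection. The sketch below is an assumption-heavy variant: the on-time performance file is positional, and I am guessing the arrival-delay minutes land in a column such as Column33, so check your .psv layout and adjust the column name and threshold.

let nodes = airports;
let edges = external_table('flights')
| where toint(Column33) > 15 // hypothetical delay column and 15-minute threshold
| project origin = Column7, dest = Column8, flight = strcat(Column1, Column2), carrier = Column1;
edges
| make-graph origin --> dest with nodes on AIRPORT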
Visualize with “make-graph”
When this query is run, if the last line of the query is “make-graph”, Kusto.Explorer will automatically pop up a new window titled “Chart” to view the data. In the image below, I chose to change the visualization to a dark theme and then colored the edges based on the “carrier” column of the flight data.
Note: I have zoomed in on the cluster of interest.
If I drag a few of the nodes around, I can start to see there are some nodes (airports) with a lot of orange connections. If I click on an orange link, I quickly learn the orange lines are Delta Flights and the three nodes I pulled out in the image below are Atlanta, Minneapolis, and Detroit.
Conclusion
I started with tables of text-based data and ended with a nice “network” visualization of my flights data. The power of graph visualization to see the relationships between my data rather than just reading tables is invaluable.
Next, I am excited to start to explore visualizations of the data for supply chains and product recommendations.
Isolated Databricks cluster call from Synapse or Azure Data Factory
How can I create a job in Databricks with the isolation parameter set from Synapse or Azure Data Factory? I cannot find any option that lets me pass this value as a parameter, and without it I have no access to my Unity Catalog in Databricks.
example:
{
    "num_workers": 1,
    "cluster_name": "…",
    "spark_version": "14.0.x-scala2.12",
    "spark_conf": {
        "spark.hadoop.fs.azure.account.oauth2.client.endpoint": "…",
        "spark.hadoop.fs.azure.account.auth.type": "…",
        "spark.hadoop.fs.azure.account.oauth.provider.type": "…",
        "spark.hadoop.fs.azure.account.oauth2.client.id": "…",
        "spark.hadoop.fs.azure.account.oauth2.client.secret": "…"
    },
    "node_type_id": "…",
    "driver_node_type_id": "…",
    "ssh_public_keys": [],
    "spark_env_vars": {
        "cluster_type": "all-purpose"
    },
    "init_scripts": [],
    "enable_local_disk_encryption": false,
    "data_security_mode": "USER_ISOLATION",
    "cluster_id": "…"
}
MS Access won’t read form variables as query parameters
I use form variables as parameters in my queries. I have been doing this for 25 years in Access. Recently I have noticed that some of my queries are showing null values for form variables that are present on the form, even though I can see there is content in them. I can even see data in the Immediate window when I reference the form fields in question, but it doesn’t work anymore in the query.
What would be the reason that referencing a form variable in a query would not work (would not retain the data in the parameter that references a form variable)?
EX:
SELECT tblLoanTypes.LoanTypeCode, tblSDSTeam.ScheduledAttendancePerWeek, tblSDSTeam.StartDateThisAcademicYear, tblSDSTeam.EndDateThisAcademicYear, tblSDSTeam.AwardPeriodBeginDate, tblSDSTeam.AwardPeriodEndDate, tblSDSTeam.NumPaymentPeriodsInThisAward, tblSDSTeam.StudentProgramEnrollmentId, tblSDSTeam.StudentAcademicYearId, tblSDSTeam.StudentAwardId, tblSDSTeam.ExportDate, [Forms]![frmMain]![cRecId] AS Expr1, tblSDSTeam.ScheduledAwardAmount, tblSDSTeam.ScheduledAwardAmount, "R" AS Expr2, [Forms]![frmMain]![cSeq] AS Expr3, Left(Trim([StudentId]),9) AS Expr4, tblSDSTeam.AwardYear
FROM tblSDSTeam INNER JOIN tblLoanTypes ON tblSDSTeam.AwardType = tblLoanTypes.LoanType
WHERE (((tblSDSTeam.StudentProgramEnrollmentId)=[Forms]![frmMain]![nStudentProgramEnrollmentId]) AND ((tblSDSTeam.StudentAcademicYearId)=[Forms]![frmMain]![nStudentAcademicYearId]) AND ((tblSDSTeam.StudentAwardId)=[Forms]![frmMain]![nStudentAwardId]));
The query above worked fine until a month ago. But now none of the form variables are making it to the query with data in them.
I compacted and repaired, re-created the tables and queries from scratch – with no luck. Any help is appreciated.
Microsoft® Access® for Microsoft 365 MSO (Version 2408 Build 16.0.17928.20066) 64-bit
Windows 11 pro with 32 GIG Ram
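One thing worth trying, offered only as a suggestion rather than a confirmed fix: declare the form references explicitly at the top of the query so Access treats them as typed parameters instead of resolving them on the fly. The Text/Long types below are guesses based on the c/n prefixes; match them to your actual control values.

PARAMETERS [Forms]![frmMain]![cRecId] Text ( 255 ),
    [Forms]![frmMain]![cSeq] Text ( 255 ),
    [Forms]![frmMain]![nStudentProgramEnrollmentId] Long,
    [Forms]![frmMain]![nStudentAcademicYearId] Long,
    [Forms]![frmMain]![nStudentAwardId] Long;

followed by the same SELECT … FROM … WHERE statement shown above. If the references still come through as Null, that may at least help narrow the problem to how the controls are populated rather than how the query resolves them.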
How to Custom Export Excel using power automate
Hi Everyone,
I have a SharePoint list with multiple values.
I want to export to Excel using Power Automate with custom columns: **bleep**, Name, List App (deleted).
The flow uses “Create HTML table”.
But the results I get do not match expectations. I tried selecting two options: Chrome and Mozilla Firefox.
Is there something wrong with the flow, or do I have to add code to the Power App? If I need to add code, what should I add?
Swappable boot drives with Intune
I have a situation where multiple users get hard drives assigned to them to use in our classroom lab PCs that have drive trays where they insert their assigned drive. I have been testing how Intune handles the new drive being inserted.
When the first drive is built from the on-prem deployment system, it is recognized by Intune and all is well. When the second student arrives and builds their drive, all is well. The problem arises when the first student comes back to boot their drive. Intune flags it as non-compliant and will not pull new policy. They can still use the machine but Intune freaks out a bit. When the second drive comes back, all is well again in Intune.
I realize this is a strange scenario, but I thought someone might have a clever idea of how to get Intune to recognize both builds as compliant. I’m not sure if this is just the way it registers the hardware IDs or if I’m fighting a number of issues because this is not what it is designed for.
Azure OpenAI FedRAMP High + M365 Copilot Targeting Sept 2025 for GCC High/DOD
We’re excited to share two major updates for our public sector and defense customers:
Azure OpenAI Service is now FedRAMP High authorized for Azure Government. This approval allows government agencies to securely leverage advanced AI capabilities, including GPT-4o, within their Azure Government environment.
For the first time, we’re targeting a General Availability (GA) of September 2025 for Microsoft 365 Copilot in GCC High and DOD environments (pending government authorization). Copilot will deliver powerful AI tools tailored for decision-making, automation, and enhanced collaboration, all while meeting the strict compliance and security needs of our defense and government customers.
For more information on these updates and how they can impact your workflows, check out the full blog post
Let’s discuss how you’re planning to use these AI advancements in your environments!
Outlook
Hi, recently my Outlook account was empty, all folders, emails, and information were gone. Right now I am only receiving new emails. Nothing was deleted, it was just empty. How can I restore everything?
Thank you very much.
Microsoft Copilot for Microsoft 365 GCC GA Update
Exciting news for Federal Civilian and SLG customers! Microsoft Copilot for Microsoft 365 GCC is set for General Availability on October 15, 2024 (pending US Government authorization).
Key features coming in October:
AI-powered tools in Word, Excel, Outlook, and Teams
Graph-grounded chat for quick data access
Intelligent meeting recaps
March 2025 will bring even more capabilities like real-time meeting summaries and task automation.
Security & Compliance: Fully aligned with Microsoft 365 GCC standards.
Read the full blog for more details!
Feature Deep Dive: Export for OneDrive Sync Health Reports
We are excited to announce the Public Preview for exporting your Sync Health Reports data! This feature allows you to seamlessly integrate with other datasets like Azure Active Directory (AAD), Exchange, and Teams to create actionable insights and to automate your workflows.
What are the OneDrive Sync Health Reports?
When managing the health and data of hundreds or thousands of desktops in your organization, it can be challenging to know if your users are syncing their content to the cloud and that their data is protected. That’s where the Sync Health Reports come in.
The OneDrive Sync Health Reports dashboard provides insights into the health of the devices in your organization so you can proactively maintain your organization’s information and data. These health reports contain information for individual devices including if important folders are being backed up, if any sync errors have occurred, and if there are any health issues or advisories that need attention. These insights can help you easily address issues and ensure your users’ files are protected and synchronizing with the cloud.
How does export work for the OneDrive Sync Health Reports?
The data is exported via Microsoft Graph Data Connect, enabling seamless integration with other datasets such as Azure Active Directory (AAD), Exchange, and Teams data. This integration opens the door to actionable insights and automation that can transform how you manage OneDrive sync health across your organization.
Some of the valuable questions you can answer using the exported data are:
How many devices have opted into Known Folder Move (KFM)?
Which folders are most selected for Known Folder Move (KFM)?
What is the breakdown of unhealthy devices by OS version?
What is the breakdown of unhealthy devices by OneDrive Sync client version?
Is the device for user X reporting as healthy?
How many devices are showing errors?
Which types of errors are making most devices unhealthy?
Which devices are showing a specific error?
What are the errors occurring on a specific device?
Benefits at a Glance
Comprehensive insights and actionable data: Get a holistic view of sync health across all devices and also join with other datasets for in-depth analysis and actionable insights.
Enhanced monitoring: Detect spikes in errors, monitor Known Folder Move (KFM) rollout, and more.
Automation potential: Leverage the power of automation to streamline your OneDrive sync health management.
Getting Started
Ready to dive in? Here’s how you can get started with the new OneDrive Sync Health Data Export feature:
Set up the OneDrive sync health dashboard: Configure the devices in your organization to report device status. Learn more.
Set up Microsoft Graph Data Connect: Ensure you have the necessary permissions and setup for Microsoft Graph Data Connect.
Configure your Azure storage account: Make sure your Azure storage account is ready to receive the data.
Initiate the export: Use the OneDrive admin center or PowerShell to start exporting the sync health data.
Analyze and act: Once the data is in your Azure storage account, you can begin analyzing it and integrating it with other datasets for deeper insights.
For detailed instructions and support, visit our guide Step-by-step: OneDrive Sync Health.
We hope this new feature empowers you to manage OneDrive sync health more effectively and keep your organization’s data secure and synchronized. As always, we appreciate your feedback and look forward to hearing how you’re using this new capability.
Comprehensive coverage and cost-savings with Microsoft Sentinel’s new data tier
As digital environments grow across platforms and clouds, organizations are faced with the dual challenges of collecting relevant security data to improve protection and optimizing costs of that data to meet budget limitations. Management complexity is also an issue as security teams work with diverse datasets to run on-demand investigations, proactive threat hunting, ad hoc queries and support long-term storage for audit and compliance purposes. Each log type requires specific data management strategies to support those use cases. To address these business needs, customers need a flexible SIEM (Security Information and Event Management) with multiple data tiers.
Microsoft is excited to announce the public preview of a new data tier, Auxiliary Logs, together with Summary Rules in Microsoft Sentinel, to further increase security coverage for high-volume data at an affordable price.
Auxiliary Logs support high-volume data sources including network, proxy and firewall logs. Customers can get started with Auxiliary Logs today in preview at no cost. We will notify users in advance before billing begins at $0.15 per GB (US East). Initially, Auxiliary Logs allow long-term storage; however, on-demand analysis is limited to the last 30 days, and queries run against a single table only. Customers can continue to build custom solutions using Azure Data Explorer, but the intention is that Auxiliary Logs will cover most of those use cases over time, and because they are built into Microsoft Sentinel they include management capabilities.
Summary Rules further enhance the value of Auxiliary Logs. Summary Rules enable customers to easily aggregate data from Auxiliary Logs into a summary that can be routed to Analytics Logs for access to the full Microsoft Sentinel query feature set. The combination of Auxiliary logs and Summary rules enables security functions such as Indicator of Compromise (IOC) lookups, anomaly detection, and monitoring of unusual traffic patterns. Together, Auxiliary Logs and Summary Rules offer customers greater data flexibility, cost-efficiency, and comprehensive coverage.
Some of the key benefits of Auxiliary Logs and Summary Rules include:
Cost-effective coverage: Auxiliary Logs are ideal for ingesting large volumes of verbose logs at an affordable price-point. When there is a need for advanced security investigations or threat hunting, Summary Rules can aggregate and route Auxiliary Logs data to the Analytics Log tier delivering additional cost-savings and security value.
On-demand analysis: Auxiliary Logs support 30 days of interactive queries with limited KQL, facilitating access to and analysis of crucial security data for threat investigations.
Flexible retention and storage: Auxiliary Logs can be stored for up to 12 years in long-term retention. Access to these logs is available through running a search job.
Microsoft Sentinel’s multi-tier data ingestion and storage options
Microsoft is committed to providing customers with cost-effective, flexible options to manage their data at scale. Customers can choose from the different log plans in Microsoft Sentinel to meet their business needs. Data can be ingested as Analytics, Basic and Auxiliary Logs. Differentiating what data to ingest and where is crucial. We suggest categorizing security logs into primary and secondary data.
Primary logs (Analytics Logs): Contain data that is of critical security value and are utilized for real-time monitoring, alerts, and analytics. Examples include Endpoint Detection and Response (EDR) logs, authentication logs, audit trails from cloud platforms, Data Loss Prevention (DLP) logs, and threat intelligence.
Primary logs are usually monitored proactively, with scheduled alerts and analytics, to enable effective security detections.
In Microsoft Sentinel, these logs would be directed to Analytics Logs tables to leverage the full Microsoft Sentinel value.
Analytics Logs are available for 90 days to 2 years, with 12 years long-term retention option.
Secondary logs (Auxiliary Logs): Are verbose, low-value logs that contain limited security value but can help draw the full picture of a security incident or breach. They are not frequently used for deep analytics or alerts and are often accessed on-demand for ad-hoc querying, investigations, and search.
These include NetFlow, firewall, and proxy logs, and should be routed to Basic Logs or Auxiliary Logs.
Auxiliary Logs are appropriate when using Logstash, Cribl or similar tools for data transformation.
In the absence of transformation tools, Basic Logs are recommended.
Both Basic and Auxiliary Logs are available for 30 days, with long-term retention option of up to 12 years.
Additionally, for extensive ML, complex hunting tasks, and frequent or extensive long-term retention, customers have the choice of ADX, but this adds complexity and maintenance overhead.
Microsoft Sentinel’s native data tiering offers customers the flexibility to ingest, store and analyze all security data to meet their growing business needs.
Use case example: Auxiliary Logs and Summary Rules Coverage for Firewall Logs
Firewall event logs are a critical network log source for threat hunting and investigations. These logs can reveal abnormally large file transfers, volume and frequency of communication by a host, and port scanning. Firewall logs are also useful as a data source for various unstructured hunting techniques, such as stacking ephemeral ports or grouping and clustering different communication patterns.
In this scenario, organizations can now easily send all firewall logs to Auxiliary Logs at an affordable price point. In addition, customers can run a Summary Rule that creates scheduled aggregations and route them to the Analytics Logs tier. Analysts can use these aggregations for their day-to-day work and if they need to drill down, they can easily query the relevant records from Auxiliary Logs. Together Auxiliary Logs and Summary Rules help security teams use high volume, verbose logs to meet their security requirements while minimizing costs.
Figure 1: Ingest high volume, verbose firewall logs into an Auxiliary Logs table.
Figure 2: Create aggregated datasets on the verbose logs in Auxiliary Logs plan.
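As a rough illustration of what such an aggregation could look like, the sketch below summarizes a hypothetical firewall table ingested into the Auxiliary Logs plan; the table and column names (Firewall_CL, SourceIP_s, DestinationIP_s, DestinationPort_d, SentBytes_d) are placeholders to adapt to your own schema.

Firewall_CL
| where TimeGenerated > ago(1h)
| summarize Connections = count(),
            TotalSentBytes = sum(SentBytes_d),
            DistinctPorts = dcount(DestinationPort_d)
    by SourceIP_s, DestinationIP_s, bin(TimeGenerated, 1h)

Run on a schedule by a Summary Rule, an aggregation like this lands compact hourly records in an Analytics Logs table, where analysts can alert, hunt, and join against threat intelligence, while the raw records stay in the cheaper Auxiliary tier for drill-down.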
Customers are already finding value with Auxiliary Logs and Summary Rules as seen below:
“The BlueVoyant team enjoyed participating in the private preview for Auxiliary logs and are grateful Microsoft has created new ways to optimize log ingestion with Auxiliary logs. The new features enable us to transform data that is traditionally lower value into more insightful, searchable data.”
Mona Ghadiri
Senior Director of Product Management, BlueVoyant
“The Auxiliary Log is a perfect fusion of Basic Log and long-term retention, offering the best of both worlds. When combined with Summary Rules, it effectively addresses various use cases for ingesting large volumes of logs into Microsoft Sentinel.”
Debac Manikandan
Senior Cybersecurity Engineer, DEFEND
Looking forward
Microsoft is committed to expanding the scenarios covered by Auxiliary Logs over time, including data transformation and standard tables, improved query performance at scale, billing and more. We are working closely with our customers to collect feedback and will continue to add more functionality. As always, we’d love to hear your thoughts.
Learn more
Log retention plans in Microsoft Sentinel | Microsoft Learn
Plan costs and understand pricing and billing – Microsoft Sentinel | Microsoft Learn
What’s new in Microsoft Sentinel | Microsoft Learn
Reduce costs for Microsoft Sentinel | Microsoft Learn
When to use Auxiliary Logs in Microsoft Sentinel | Microsoft Learn
Aggregate Microsoft Sentinel data with summary rules | Microsoft Learn
Microsoft Sentinel Pricing | Microsoft Azure