Tag Archives: microsoft
How to Set Up and Use the QuickBooks Remote Access Tool?
Hi everyone, I’m looking for guidance on setting up the QuickBooks Remote Access Tool. I need to manage my business’s finances while traveling and ensure secure access to our QuickBooks data. Could someone explain the steps to configure it and any best practices for using this tool effectively? Are there specific security measures I should be aware of? Thanks in advance for your help!
QuickBooks Payroll Update Error 15270 – Need Assistance
Hi QuickBooks Support Team,
I am encountering an issue with my QuickBooks Payroll update. Every time I try to update, I receive the following error message:
Error 15270: The (payroll) update did not complete successfully. The update is missing a file.
I’ve attempted several solutions, including restarting QuickBooks, ensuring my payroll subscription is active, and checking my internet connection. However, the problem persists.
Could you please provide detailed steps to resolve this error? Any help would be greatly appreciated.
Thank you in advance!
Azure Database for MySQL – June 2024 updates and latest feature roadmap
We’re excited to share a summary of the Azure Database for MySQL – Flexible Server announcements from last month, as well as the latest roadmap of upcoming features!
July 2024 Live webinar
These updates and the latest roadmap are also covered in our Monthly Live Webinar on YouTube (Click here to subscribe to our YouTube channel!), which streams the second Wednesday of every month, at 7:30 AM Pacific time. Below is a link to the session recording of the live webinar we delivered last week:
June 2024 updates and announcements
Move from private access (VNet integrated) connectivity method to public access / private endpoint
We’re thrilled to announce a new feature that allows you to transition from the private access (VNet integrated) connectivity method to the public access or Private Link connectivity method.
Previously with Azure Database for MySQL, this switch was prohibited, and you were expected to recreate a server in the Public infrastructure and migrate to it. Now, you can simply select the “Move to Private Link” option in the “Networking” pane, and go through a 2-step wizard which detaches the VNet and allows you to either establish a Private Link or enable public access. The transition process is simple and seamless, and avoids the need to alter server names or migrate data.
Learn more: Concepts | Tutorial
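To illustrate the Private Link half of that choice, a private endpoint targeting a flexible server can be created with the Azure CLI along the following lines. This is a sketch only: every resource name below is a placeholder, and the exact steps for a server that has just come out of the wizard may differ.

# Create a private endpoint targeting an Azure Database for MySQL flexible server.
# All resource names below are illustrative placeholders.
az network private-endpoint create \
  --resource-group myResourceGroup \
  --name myMySqlPrivateEndpoint \
  --vnet-name myVNet \
  --subnet mySubnet \
  --private-connection-resource-id $(az mysql flexible-server show \
      --resource-group myResourceGroup --name myserver --query id --output tsv) \
  --group-id mysqlServer \
  --connection-name myConnection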
Latest feature roadmap
For each feature below, the description is followed by the current release status and the next milestone that is tentatively* planned (“coming soon”).
On-demand backup and export
This feature provides you with the ability to export an on-demand physical backup of the server to an Azure storage account (Azure Blob Storage) using an Azure CLI command. After export, these backups can be used for data recovery, migration, data redundancy and availability, or auditing. Learn more.
Release status: Public Preview. Coming soon: General Availability in Q3 CY24*.
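On-demand backups themselves can already be taken from the Azure CLI; here is a minimal sketch, assuming the current az mysql flexible-server command surface and placeholder resource names (the export command for the new capability is part of the preview and is not shown here):

# Take an on-demand backup of a flexible server (names are placeholders).
az mysql flexible-server backup create \
  --resource-group myResourceGroup \
  --name myserver \
  --backup-name mybackup-2024-06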
Flexible maintenance options
Building upon our existing system-managed and custom-managed maintenance windows, the following new flexible maintenance options aim to elevate user convenience and operational flexibility in server maintenance:
Reschedule window: Tailor maintenance schedules to suit your business rhythm.
On-demand maintenance: Instantly initiate maintenance activities using the Reschedule now option.
Release status: Public Preview. Coming soon: General Availability in Q3 CY24*.
Near-zero downtime maintenance for HA servers
This feature is designed to substantially reduce maintenance downtime for HA-enabled servers, ensuring that in most cases, maintenance downtime is expected to be between 40 to 60 seconds. This capability is pivotal for businesses that demand high availability and minimal interruption in their database operations. Learn more.
Release status: Public Preview. Coming soon: General Availability in Q3 CY24*.
Virtual Canary
The Virtual Canary feature is an exciting solution for Azure MySQL users who prioritize staying at the forefront of technology by ensuring that their servers always run the most current version. Servers opted in to Virtual Canary receive maintenance updates earlier than the general fleet. You can also use the feature as an opportunity to perform an additional layer of update testing on your dev, test, or staging servers to help avoid workload-specific issues such as application-level compatibility problems. The feature thus offers an efficient way to manage updates, align testing and production environments, and maintain operational stability with minimal disruption.
Release status: not yet available. Coming soon: Public Preview in Q3 CY24*.
MySQL Discovery & Assessment in Azure Migrate
With this functionality, you can use Azure Migrate to discover MySQL servers in your environment, assess them by identifying their compatibility for moving to Azure Database for MySQL, and receive compute and storage SKU recommendations along with their costs. Learn more.
Release status: Private Preview. Coming soon: Public Preview in Q3 CY24*.
Long Term Retention of Backups
Previously with Azure Database for MySQL, you could retain automated backups and on-demand backups for up to 35 days. With Long Term Retention, you can now retain the backups up to 10 years, further accommodating your audit and compliance needs. Learn more.
Release status: Public Preview. Coming soon: General Availability in Q4 CY24*.
Error Logs (in Server Logs)
This feature allows you to maintain MySQL error log files under Server logs and download them for up to seven days. These error logs can help you efficiently identify and troubleshoot performance and reliability issues, and proactively detect and respond to unauthorized access attempts, failed login attempts, and other security-related events. Learn more.
Release status: Public Preview. Coming soon: General Availability in Q4 CY24*.
CMK-enabled support for Accelerated Logs
Accelerated Logs, available with the Business Critical service tier and designed for mission-critical workloads, is a feature that provides up to a two-times (2x) increase in throughput for your applications at no additional cost. The feature will soon be supported on servers that have Customer Managed Keys (CMK) enabled.
Release status: not yet available. Coming soon: General Availability in Q4 CY24*.
*The roadmap features and dates are tentative and subject to change. Please stay tuned for continuous updates.
Conclusion
As we continue to work on new features and functionality, your feedback is critical to our improvement. If you wish to enrol in a Private Preview for any of the above features, or if you have any suggestions for or queries about our service, email us at AskAzureDBforMySQL@service.microsoft.com.
To learn more about what’s new with Flexible Server, see What’s new in Azure Database for MySQL – Flexible Server. Stay tuned for more updates and announcements by following us on social media: YouTube | LinkedIn | X.
Verify the integrity of Azure Confidential Ledger transactions with receipts and application claims
In today’s digital landscape, the integrity and confidentiality of transactional data are paramount. Microsoft’s Azure Confidential Ledger offers a robust solution for maintaining the privacy and confidentiality of your data. The service utilizes cryptographic techniques to generate transaction receipts, which serve as immutable evidence of the ledger’s state at a specific point in time. These receipts are crucial for businesses that require a high level of trust and transparency in their operations.
Write receipts
The value proposition of Azure Confidential Ledger write receipts lies in their ability to provide a verifiable trail of all write transactions. Azure Confidential Ledger leverages the Confidential Consortium Framework (CCF), which ensures the integrity of transactions by using a Merkle tree data structure to store the hash of all transaction blocks that are added to the immutable ledger.
How write transactions are recorded in the ledger using an internal Merkle Tree data structure in CCF.
When a write transaction is completed, Azure Confidential Ledger users can obtain a cryptographic Merkle proof, or receipt, over the entry created in a Confidential Ledger to check that the write operation was recorded correctly. A write transaction receipt is evidence that the system has committed the corresponding transaction and can be used to confirm that the entry has been successfully appended to the ledger. This ensures that once a transaction has been committed to the ledger, it cannot be altered or deleted without detection.
For more details on Azure Confidential Ledger write receipts, their structure, and how to get a receipt from an active ledger, please refer to this dedicated article.
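For intuition, the following is a minimal, generic sketch in Python of how a Merkle inclusion proof is checked: the leaf hash is combined with each sibling hash along the path, and the result is compared with the signed root. This is illustrative only; an actual CCF receipt carries additional fields (such as the signing node’s certificate and signature) and should be checked with the SDK’s verify_receipt utility shown later in this post.

import hashlib

def verify_merkle_proof(leaf_hash: bytes, proof: list, expected_root: bytes) -> bool:
    """Recompute a Merkle root from a leaf hash and an inclusion proof.

    Each proof element is a (position, sibling_hash) pair, where position
    is "left" or "right" relative to the running hash.
    """
    current = leaf_hash
    for position, sibling in proof:
        if position == "left":
            # Sibling sits to the left of the running hash in the tree.
            current = hashlib.sha256(sibling + current).digest()
        else:
            # Sibling sits to the right of the running hash in the tree.
            current = hashlib.sha256(current + sibling).digest()
    # The recomputed root must match the root the service signed.
    return current == expected_root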
Application claims
Application claims take receipts a step further by allowing users to attach arbitrary metadata to a transaction, which are eventually reflected in write receipt response payloads. This metadata includes details specific to the transaction’s context, such as the collection ID and the input content of a write operation. The application claims of a write transaction ensure that the claims digest is signed securely and stored together with the transaction itself, meaning that it cannot be tampered with once the transaction is committed.
Example of an application claim attached to a write receipt response payload.
Later, the application claims are shown in plain format in the receipt payload for the same transaction in which they were added. Using the claims in plain format, users can recompute the claims digest (available in the write receipt) that the ledger signed during the transaction, in order to verify the claims’ authenticity. The claims digest thus helps verify the write transaction receipt, giving users an offline way to check the authenticity of the recorded claims.
By leveraging application claims, organizations can tailor the ledger to their specific needs, enhancing the utility and relevance of the data stored within receipts. Application claims are currently supported in the Azure Confidential Ledger preview API version 2023-01-18-preview and their current format is documented in this article.
Receipts and claims verification
The process of verifying write transaction receipts and application claims is straightforward and secure. Utilizing cryptographic proofs, users can independently confirm the authenticity and integrity of each transaction offline, without having to connect to the ledger or trust any central authority.
The Azure Confidential Ledger client library for Python offers useful functions to validate receipts of write transactions and calculate the claims digest from a list of application claims in an easy and seamless manner. With this verification utility, any write receipt from a Confidential Ledger service can be verified with ease and any application claims associated with the transaction can be fully authenticated.
from azure.identity import DefaultAzureCredential
from azure.confidentialledger import ConfidentialLedgerClient
from azure.confidentialledger.certificate import (
    ConfidentialLedgerCertificateClient,
)
from azure.confidentialledger.receipt import (
    verify_receipt,
)

LEDGER_ID = "acl-test-ledger"  # Replace with the ID of the ledger to get the receipt from.
TRANSACTION_ID = "2.50"  # Replace with the ID of the transaction to get the receipt for.
API_VERSION = "2023-01-18-preview"  # Use this API version for application claims support.

# Build a ConfidentialLedgerClient object through AAD.
ledger_client = ConfidentialLedgerClient(
    f"https://{LEDGER_ID}.confidential-ledger.azure.com",
    credential=DefaultAzureCredential(),
    ledger_certificate_path="service_cert.pem",
    api_version=API_VERSION,
)

# We assume that the target transaction has been committed to the ledger in a previous step.
# Please refer to the Azure Confidential Ledger Python SDK samples and documentation
# for details on how to create an entry and wait for it to be committed.

# Get a receipt from the ledger for the input transaction.
poller = ledger_client.begin_get_receipt(TRANSACTION_ID)
get_receipt_response = poller.result()
print(get_receipt_response)

try:
    # Verify the contents of the receipt, with optional application claims (if any).
    verify_receipt(
        get_receipt_response["receipt"],
        ConfidentialLedgerCertificateClient()
        .get_ledger_identity(LEDGER_ID)
        .get("ledgerTlsCertificate"),
        application_claims=get_receipt_response.get("applicationClaims", None),
    )
    print(f"Receipt for transaction id {TRANSACTION_ID} successfully verified")
except ValueError:
    print(f"Receipt verification for transaction id {TRANSACTION_ID} failed")
    raise
How to verify receipts (with optional application claims) using the Azure Confidential Ledger Python SDK.
The decentralized and offline approach to verification bolsters the security and reliability of the system, making Azure Confidential Ledger an ideal platform for applications that demand the highest levels of data integrity. To learn more about the Data Plane Python SDK and its receipt verification utilities, check out this section and the full sample code.
Conclusion
In conclusion, Azure Confidential Ledger’s receipts and application claims offer a compelling value proposition for organizations looking to secure their transactional data. With its strong focus on integrity, confidentiality, and verifiability, Azure Confidential Ledger stands out as a leading solution in the realm of confidential computing. Whether you are managing financial transactions, supply chain management, or any other data-sensitive operation, Azure Confidential Ledger provides the assurance that your data remains untampered and trustworthy through transaction receipts and application claims.
Resources
For getting started with Azure confidential ledger write receipts and application claims, please refer to our documentation:
Azure Confidential Ledger write transaction receipts | Microsoft Learn
Verify Azure Confidential Ledger write transaction receipts | Microsoft Learn
Powershell – Set-VpnConnectionIPsecConfiguration : Invalid namespace
The default Windows built-in L2TP client uses 3DES, and for this VPN connection I need to use AES (up to AES256), so I found a command to edit a connection via PowerShell:
Set-VpnConnectionIPsecConfiguration -ConnectionName L2TP -AuthenticationTransformConstants SHA196 -CipherTransformConstants AES128 -DHGroup Group14 -EncryptionMethod AES128 -IntegrityCheckMethod SHA1 -PfsGroup PFS2048 -Force
This does work on a test machine; I was able to connect to a Cisco ASA. My problem is that on the PC that needs the VPN connections, I get this error:
Set-VpnConnectionIPsecConfiguration : Invalid namespace
At line:1 char:1
+ Set-VpnConnectionIPsecConfiguration -ConnectionName L2TP -Authenticat …
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : MetadataError: (PS_VpnConnectionIPsecConfiguration:root/Microsoft/…ecConfiguration) [Set-VpnConnectionIPsecConfiguration], CimException
+ FullyQualifiedErrorId : HRESULT 0x8004100e,Set-VpnConnectionIPsecConfiguration
I’m not much of a Windows user, even less so with PowerShell. Googling “Invalid namespace” and “CimException” has not gotten me anywhere, and I have yet to find anything specific to someone else encountering this error when modifying a VPN connection. I also get this Invalid namespace error when trying to add a new connection via PowerShell.
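For what it’s worth, HRESULT 0x8004100e is WBEM_E_INVALID_NAMESPACE, which generally means the WMI/CIM namespace behind a cmdlet is missing or damaged on that machine. A diagnostic sketch, assuming (as the truncated class path in the error output suggests) that the VPN cmdlets rely on a RemoteAccess namespace under root\Microsoft\Windows, is to list the namespaces actually present:

# List the CIM namespaces under root\Microsoft\Windows; the VPN client
# cmdlets are expected to rely on a RemoteAccess namespace here.
Get-CimInstance -Namespace root\Microsoft\Windows -ClassName __NAMESPACE |
    Select-Object -ExpandProperty Name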
External hard drive suddenly “unlabeled volume 1”
I was getting the error “The Recycle Bin on D: is corrupted” (external). Then, when I unplugged it and plugged it back in, it was unlabelled, and I can’t access the files anymore. The error message says, “This volume does not contain a recognized file system.” Yes, the disk shows good health on CrystalDisk, but it’s asking me to format it.
Any fix besides formatting so I can access my files again?
Is it possible to search the “Show Changes” results in Excel 365?
I get so many results, and I’d like to search them. I see the filter offering a cell or range or sheet. I’d like to see if there are changes from specific people.
Macro to Sort two columns
I need an Excel Macro for sorting.
I have two columns of data: column A contains addresses, and column B their grid references. I add new addresses and grid references after the last currently used row, then sort ascending on column A. I have the following code from LibreOffice; what needs changing for it to run in Excel?
I think I understand this code: on the active sheet, find the last used row in column A, then sort on column A. Set the range to A3:B(last row in column A), do not sort row 3 as this is the header, accept any case, and sort top to bottom. I should not need SortMethod = xlPinYin as I only use English characters; do I need another method? Lastly, apply the sort. NOTE: the macro will be run from a push button on the Grid_References sheet.
Can anyone please advise?
I guess the Excel code needs:
Select the correct worksheet: Sheets("Grid_References").Select
Select the range, e.g. cells A3:B(last used row in A): ActiveSheet.Range("A10000").End(xlUp).Row
Sort on column A: Columns("A:B").Sort Key1:=Range("A3:B(last A row)"), Order1:=xlAscending, Header:=xlYes
LibreOffice macro:
Rem Attribute VBA_ModuleType=VBAModule
Option VBASupport 1
' Sorting Named Range A-Column-Wise in Ascending Order
Sub Grid_References()
    ' ## 24/11/2023 ##
    Set ws = ActiveSheet
    Dim rowOne, L
    rowOne = 3 ' data from row 3 / headings
    L = ws.Cells(Rows.Count, "A").End(xlUp).Row
    With ws.Sort
        .SortFields.Clear
        .SortFields.Add Key:=ws.Range(ws.Cells(rowOne, "A"), ws.Cells(L, "A")), SortOn:=xlSortOnValues, Order:=xlAscending, DataOption:=xlSortNormal
        .SetRange ws.Range(ws.Cells(rowOne, "A"), ws.Cells(L, "B"))
        .Header = xlYes
        .MatchCase = False
        .Orientation = xlTopToBottom
        .SortMethod = xlPinYin
        .Apply
    End With
End Sub
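For reference, here is a sketch of the same macro adjusted for Excel VBA, assuming the sheet is named Grid_References: the LibreOffice-specific Rem attribute and Option VBASupport 1 lines are dropped, the variables are declared explicitly, and SortMethod is omitted since xlPinYin is already Excel's default.

Sub Grid_References_Excel()
    Dim ws As Worksheet
    Dim rowOne As Long, lastRow As Long

    Set ws = Worksheets("Grid_References")
    rowOne = 3 ' data starts at row 3; row 3 holds the headings
    lastRow = ws.Cells(ws.Rows.Count, "A").End(xlUp).Row

    With ws.Sort
        .SortFields.Clear
        .SortFields.Add Key:=ws.Range(ws.Cells(rowOne, "A"), ws.Cells(lastRow, "A")), _
            SortOn:=xlSortOnValues, Order:=xlAscending, DataOption:=xlSortNormal
        .SetRange ws.Range(ws.Cells(rowOne, "A"), ws.Cells(lastRow, "B"))
        .Header = xlYes ' row 3 is treated as the header and is not sorted
        .MatchCase = False
        .Orientation = xlTopToBottom
        .Apply
    End With
End Sub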
Detect horizontal / vertical port scans
Hi everyone,
I recently installed Greenbone OpenVAS and performed a port scan in the servers’ subnet (all have Defender installed). I would have expected an alert, but… nothing. Just an IIS server had some bad logins.
I then hunted for the remote IP and used this query
DeviceNetworkEvents
| where Timestamp > ago(1d) and RemoteIP startswith "172.20.100.100"
| summarize
by RemoteIP, DeviceName, RemotePort
| summarize RemotePortCount=dcount(RemotePort) by DeviceName, RemoteIP
I got 31 hosts back that Greenbone connected to within one hour.
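For reference, a thresholded variant of the same hunt, sketched here under the assumption of the standard DeviceNetworkEvents schema, that surfaces only remote IPs fanning out across many distinct ports per device:

DeviceNetworkEvents
| where Timestamp > ago(1d)
| summarize RemotePortCount = dcount(RemotePort) by DeviceName, RemoteIP
| where RemotePortCount > 50 // illustrative threshold for a likely scanner
| order by RemotePortCount desc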
Is there a detection for this at all? And if yes, how high is the threshold?
BR
Stephan
Numbering Only Visible Rows in Excel
Hello everyone,
I am looking to number only the visible rows in my Excel table using an Excel function, starting from “1” and leaving a blank string for the hidden rows. The numbering should be sequential and ignore the hidden rows. I would like this numbering to be done in the column titled “Line Number.”
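One common approach, sketched here under the assumption that the data starts at row 3 and column B is always populated: SUBTOTAL with function number 103 (COUNTA, ignoring hidden rows) returns 0 for a hidden row's own cell, so the "Line Number" formula can blank itself out on hidden rows while a running SUBTOTAL numbers the visible ones sequentially:

=IF(SUBTOTAL(103,$B3)=0,"",SUBTOTAL(103,$B$3:$B3))

The running SUBTOTAL over $B$3:$B3 counts only visible rows up to the current one, so the numbering stays 1, 2, 3, … whenever rows are filtered or hidden and the sheet recalculates.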
Thank you in advance for your help!
Teams defaults to expired account
I’m a freelance professional and as such I’ve had to use differing accounts to log into Microsoft Teams.
I have a meeting this afternoon and only my tablet available for Teams.
However, although I select the account I wish to use, Teams repeatedly tries to log me in with the email address of an expired account.
I have repeatedly selected the account I require and entered the correct login details, yet Teams immediately moves to the login screen with the email address of the expired account.
I select login with a new account, and the loop starts again.
How can I make Teams accept the details of the account I have correctly entered?
Thanks
Azure Backup for laptop backup
Hi,
Is it possible to create a complete backup of a laptop using Azure? I have two critical laptops that I need to be able to back up and restore in case of laptop failure. That means restoring on another laptop with all installed apps and settings.
Thank you.
Nonprofit eligibility review still pending after 30 days
Hi all,
We have applied for Microsoft 365 Business Premium as a nonprofit organization, but the review is still pending 30 days after registration.
I have been searching for a contact to write to and check if there is a problem, but it’s impossible to find one on the Contact page. There are only FAQs and articles there, and none of those help in this situation.
Any idea what can be done?
Thanks,
Gorast
Share App Content To Stage
Hi,
I am using the Teams SDK to share my content to the stage and have set the shareOptions to ScreenShare, intending for only the presenter to interact with it. However, despite this configuration, participants are still able to interact with the content. Can you help me resolve this issue?
microsoftTeams.meeting.shareAppContentToStage(
    (err, result) => { },
    appContentUrl,
    {
        sharingProtocol: microsoftTeams.meeting.SharingProtocol.ScreenShare
    }
);
Accelerate data democratization in the era of generative AI using the Denodo Platform and Microsoft Fabric
In this guest blog post, Mitesh Shah, Director of Cloud Product Management, GTM, Alliances at Denodo Technologies, explains how the Denodo Platform along with Microsoft Fabric enables a robust data integration, data management, and data delivery platform in hybrid and multi-cloud environments. Learn how you can extend your data analytics and generative AI use cases and deliver smart data management in real time.
Data landscape and business challenge
Enterprises deal with massive amounts of data that have become the backbone for advanced analytics and generative AI projects. As the data volume continues to grow, it is often fragmented and siloed across multiple data sources, waiting to be analyzed. At the same time, the emergence of large language models (LLMs) and generative AI marks a significant leap in technology, promising to deliver transformational automation and innovation across diverse industries and use cases.
The most common challenge in enterprise data management is multiple lines of businesses operating silos of data that are not truly connected. It is difficult to find deep and accurate insights without a single source of truth. Stitching together unique analytics tools across organizations is complicated. Costs associated with procuring and managing these capabilities can be exorbitant. And there is a significant risk associated with lack of governance.
At the same time, generative AI relies on LLMs, which have inherent limitations around their training data. These LLMs lack insights about enterprise-wide data, thus limiting operational use cases tied to real-time reporting and decision making. This impacts generative AI-led data management because of inaccurate and inconsistent results from LLMs. The results are end-user mistrust, regulatory violations around ethical use of AI, and issues with security and privacy compliance in the data management landscape.
The Denodo Platform and Microsoft Fabric to the rescue
Most enterprises have established a centralized data and analytics (D&A) center of excellence to support federated D&A initiatives and prevent enterprise failure. Data is a critical component of these D&A centers, which have become a key priority for organizations.
Supporting well-known corporate architectures such as data fabric and data mesh, the Denodo Platform, available in the Microsoft Azure Marketplace, offers a robust framework and provides a semantic layer for data management, abstracting data from end users while democratizing access across multiple tools and services. In a similar fashion, Microsoft Fabric brings together the best parts of data mesh and data fabric to provide a one-stop shop for data integration, data engineering, real-time analytics, data science, and business intelligence needs without compromising data privacy and security.
Microsoft Fabric combines Azure Data Factory, Azure Synapse Analytics, Data Explorer, and Power BI into a unified experience in the cloud. The open and governed data lakehouse foundation provides a cost-effective and performance-optimized fabric for business intelligence, machine learning, and AI workloads at any scale. It is the foundation for migrating and modernizing existing analytics solutions, whether this be data appliances or traditional data warehouses.
This architecture works well when all data can be centrally stored in OneLake. However, that is not a common scenario, as users deal with data in various formats across many applications, including SaaS (such as Salesforce and ServiceNow), on-premises and legacy applications, and data spread across multiple regions. The Denodo Platform extends these use cases by providing a strongly integrated framework across all sources of data in hybrid and multi-cloud environments.
A unified data access layer has always been critical to delivering business insights and driving business success. But next-generation AI applications will make it even more important for organizations to take full advantage of the data at their disposal, regardless of where it is stored and what form it takes. As LLMs and generative AI technology inevitably evolve, organizations will also require a data management foundation that is flexible and agile, allowing new data sources to be added quickly and new data views to be developed easily to support new emerging AI use cases. An adaptable data management layer also maximizes the ability to interchange off-the-shelf AI services as newer, better, and cheaper options are released.
Key strengths and features of the Denodo Platform
There are a variety of use case scenarios where the Denodo Platform adds value by augmenting and complementing the strengths of Microsoft Fabric architecture:
Customers who don’t want to migrate all their data into OneLake, which is a common and practical scenario in data management
Users looking for unified data security across all data repositories and environments, outside of OneLake
Flexible deployment options in a hybrid and multi-cloud environment, supporting data modeling and query optimization techniques to accelerate data integration and delivery in a performant manner
Ability to model data to business users and accelerate cloud adoption via transition of workloads to Azure at their own pace without impacting the data consumers
Benefits of integrated technologies
The Denodo Platform, leveraging data virtualization technology, minimizes the need for costly data movement or consolidation before augmenting an AI application. Here are a few common use cases and benefits delivered via the Denodo Platform and Microsoft Fabric:
Data self-service for data democratization in a hybrid and multi-cloud environment (examples: self-service reporting and KPI dashboards, data mesh/self-service data product development)
IT infrastructure modernization (fueling cloud workload transition from on-premises to cloud, legacy retirements, application consolidations, and data lake optimizations)
Data foundation for improved customer experience (driven via generative AI support for Customer 360, self-service portals, digital engagement applications, full-journey reporting, and analytics)
Improved operational efficiency, agility, and resilience (examples: Data-as-a-Service/data marketplaces, and supply chain optimization)
Centralized governance, risk, compliance (examples: centralized data privacy/security for all sources beyond OneLake, financial regulatory reporting, sustainability/ESG reporting, anti-fraud/money laundering, and risk analytics)
The Denodo Platform provides a consolidated data foundation for AI applications to access integrated data and offers other key benefits, including:
A unified, secure access point for LLMs to interact with and query all enterprise data (ERP, operational data mart, EDW, application APIs)
A rich semantic layer, providing LLMs with the needed business context and knowledge (such as table descriptions, business definitions, categories/tags, and sample values)
Quick delivery of logical data views that are decoupled and abstracted from the underlying technical data views (which can be difficult to use by LLMs)
Delivery of LLM-friendly wide logical table views and built-in query optimization relieves LLMs from dealing with specific data source constraints or optimized join strategies
In summary, the Denodo Platform’s ability to manage and process widespread corporate data (structured and unstructured) alongside Microsoft Fabric support via OneLake creates a strong foundation for supporting generative AI applications. This enables real-time data access for chatbots needing data from various systems to deliver accurate and appropriate responses to customer prompts. The Denodo Platform and Microsoft Fabric deliver a unified data fabric for a strong and governed data foundation in a multi-cloud environment, thus accelerating retrieval augmented generative AI projects and amplifying generative AI applications to deliver business value across the enterprise.
You can experience all these capabilities via Denodo Enterprise Plus, available in the Microsoft Azure Marketplace.
Hello authentication in apps always defaults to fingerprint
I have fingerprint, face and PIN Windows Hello authentication options set up. When logging in to Windows, Windows helpfully attempts to use facial recognition whilst at the same time activating the fingerprint sensor. This allows me to use either option to login (or revert back to PIN in case both fail).
However, once logged in, some of my apps (such as password managers and browsers) also use Hello authentication. In these cases, the Hello prompt seems to always immediately default to fingerprint, without attempting facial recognition. This includes when the laptop is in Tablet Mode, when the fingerprint sensor is physically inaccessible. In all cases, to use facial recognition I have to click through the “more choices” menu in the dialog, which is a lot of extra clicks and defeats the purpose of facial recognition being relatively quick and straightforward.
Alternatively I can disable fingerprint entirely in settings. This forces Hello to attempt facial before reverting to PIN, but this is obviously not ideal as it leaves me without fingerprint login available across Windows.
Is there a setting or fix that makes Hello authentication for apps behave the same way as it helpfully does at the login screen, i.e. not default to fingerprint each time?
Copilot for Word
Hi There,
I’ve been playing around with Copilot for Microsoft 365 for quite some time, and I have a quick question about it. Let’s say I’m using Copilot for Word: I ask Copilot to create a document for me, I review it, and I click Regenerate because I’m not happy with the result. How many times can I ask Copilot to create a new document for me? Is there a limit? For instance, could I click it five times and get different content every time, and beyond that it returns to the content it created the first time? Please let me know the limit here.
How to fix the “View My Paycheck” Not Working error in QuickBooks after an update?
I’ve been experiencing issues with the “View My Paycheck” feature in QuickBooks. Employees can’t access their paychecks online, and I keep getting an error message. Has anyone else encountered this problem? Any solutions or tips would be greatly appreciated.
How to migrate data from QuickBooks Online to QuickBooks Desktop after an update?
I need help migrating data from QuickBooks Online to QuickBooks Desktop. I’ve tried following various guides, but I’m facing errors and data inconsistencies. Has anyone successfully performed this migration? Any step-by-step advice or professional services recommendations would be greatly appreciated!