Configuring the Archive Period for Tables in Bulk for Data Retention in a Log Analytics Workspace
How this blog helps with configuring the archive period for tables in bulk for data retention in a Log Analytics Workspace:
Simplified Data Archival: Implementing archival within Log Analytics Workspace provides a straightforward and integrated solution for retaining log data over extended periods. This ensures compliance with regulatory requirements, making it easier for organizations to meet data retention mandates without resorting to complex external storage solutions.
Efficient Data Management: This blog's primary focus, applying archival settings to multiple tables in bulk within a Log Analytics Workspace, streamlines the management of a diverse range of log data. This efficiency is invaluable for organizations handling large volumes of logs from various sources, as it simplifies data retention policies and significantly reduces administrative overhead.
Cost and Complexity Optimization: By leveraging Log Analytics Workspace for archival, organizations can maintain a balance between cost-effective storage and data accessibility. This approach eliminates the need for more complex and potentially costly alternatives like Blob Storage and Azure Data Explorer (ADX) for archival, thus reducing both operational complexity and storage expenses. It provides a practical solution for long-term data retention while optimizing both cost and management efforts.
Step 0: Default approach to perform archival at a table level in Log Analytics Workspace
Navigate to Log Analytics Workspace > Table > Manage Table
To replicate the above for multiple tables, use the PowerShell-based steps below.
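The same per-table change can also be made without the portal. A minimal sketch using the Az.OperationalInsights module; the resource group, workspace, and table names here are placeholders, not values from this walkthrough:

```powershell
# Requires the Az.OperationalInsights module and a signed-in session (Connect-AzAccount).
# Resource group, workspace, and table names are placeholders - replace with your own.
$params = @{
    ResourceGroupName    = "myResourceGroup"
    WorkspaceName        = "myWorkspace"
    TableName            = "SecurityEvent"
    RetentionInDays      = 90      # interactive (workspace) retention
    TotalRetentionInDays = 2556    # total retention; the difference becomes the archive period
}
Update-AzOperationalInsightsTable @params
```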
Step 1: Fetch the list of tables on which archiving is required using the following KQL.
KQL to fetch the active table list:
search * | distinct $table
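Note that search * returns only tables that received data within the queried time range, so widen the time picker if older tables are missing. As an alternative sketch, the table list can also be pulled directly with PowerShell (resource group and workspace names are placeholders), which would let you skip the CSV export steps entirely:

```powershell
# List all table names in the workspace; names below are placeholders.
Get-AzOperationalInsightsTable `
    -ResourceGroupName "myResourceGroup" `
    -WorkspaceName "myWorkspace" |
    Select-Object -ExpandProperty Name
```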
Step 2: Export the KQL result set:
Export the table list to CSV using the export functionality.
Step 3: Open the exported CSV file with Excel.
Step 4: Rename the "$table" column to "Table":
Rename $table to Table as highlighted:
Step 5: Rename the Excel file as well:
From "query_data" to "SentinelTable" as shown, so the file name matches the one used in the PowerShell command below.
Step 6: Open Cloud Shell in the Azure portal and upload the new file:
Upload the file from your local machine:
Step 7: Verify the upload with the "ls" list command:
Step 8: Once the file upload completes, run the following PowerShell command in Cloud Shell:
Import-Csv "SentinelTable.csv" | ForEach-Object { Update-AzOperationalInsightsTable -ResourceGroupName sentineltraining -WorkspaceName sentineltrainingworkspace -TableName $_.Table -TotalRetentionInDays 2556 }
Before running the command, make sure to update:
* -TotalRetentionInDays as required for your scenario.
* The resource group name and Log Analytics workspace name, respectively.
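The applied setting can be verified for a single table before reviewing the full list in the portal. A sketch reusing the resource group and workspace from the command above; the table name is a placeholder:

```powershell
# Inspect the retention settings of one table; the table name is a placeholder.
Get-AzOperationalInsightsTable `
    -ResourceGroupName "sentineltraining" `
    -WorkspaceName "sentineltrainingworkspace" `
    -TableName "SecurityEvent" |
    Select-Object Name, RetentionInDays, TotalRetentionInDays
```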
Step 9: Check the archival settings of the following Log Analytics tables:
Step 10: The exported tables now show the updated archival period, while all other tables keep the default retention from the Log Analytics settings:
Navigation: Log Analytics Workspace > Settings > Tables > Archive Period.
Conclusion:
1. This blog covers the default, table-level approach to archival for long-term storage within a Log Analytics Workspace.
2. It then covers the steps to scale archival across multiple tables, which is a key production requirement.
3. All the steps can be implemented in a lab environment, and the archival period can be observed in the Log Analytics Workspace table blade.