Category: News
Dealing with Intuit Data Protect not working because of Firewall or Connection Issues?
Troubleshooting Solutions: Intuit Data Protect not working because of Firewall or Connection Issues
Intuit Data Protect (IDP) is a critical tool for safeguarding your QuickBooks company files. However, issues with firewalls or connectivity can disrupt its functionality. Here’s a comprehensive guide to resolving these problems.
Firewall Configuration:
- Check Firewall Settings: Ensure that your firewall isn’t blocking Intuit Data Protect. Navigate to your firewall settings and add exceptions for IDP to allow incoming and outgoing connections.
- Configure Port Settings: IDP requires specific ports to be open for communication. Configure your firewall to allow traffic through ports 80, 443, 56728, and 55368.
- Third-Party Firewalls: If you’re using third-party security software, such as Norton or McAfee, adjust their settings to permit IDP traffic.
Internet Connectivity:
- Network Connection: Verify that your computer has a stable internet connection. Weak or intermittent connections can disrupt IDP backups.
- Router Settings: Check your router’s configuration to ensure it’s not blocking IDP traffic. Port forwarding may be necessary for seamless communication.
- VPN Considerations: If you’re using a Virtual Private Network (VPN), confirm that it doesn’t interfere with IDP operations. Temporarily disable the VPN or configure it to allow IDP traffic.
Software Updates:
- QuickBooks Updates: Ensure that your QuickBooks software is up to date. Sometimes, compatibility issues arise between QuickBooks updates and IDP versions.
- IDP Updates: Similarly, make sure that your Intuit Data Protect software is updated to the latest version. Newer releases often include bug fixes and compatibility improvements.
- Automatic Updates: Enable automatic updates for both QuickBooks and IDP to stay current with the latest enhancements and fixes.
Antivirus Software:
- Scan Exclusions: Add IDP-related folders and processes to your antivirus software’s exclusion list. Scanning these files during IDP operations can lead to performance degradation.
- Real-Time Protection: Configure your antivirus software to avoid interfering with IDP processes. Disable real-time scanning temporarily to determine if it’s causing the issue.
- Whitelist IDP: If your antivirus offers a whitelist feature, add Intuit Data Protect to ensure it operates without interruptions.
System Configuration:
- User Permissions: Ensure that you have sufficient permissions to run Intuit Data Protect. Administrative privileges may be required for certain operations.
- Resource Allocation: Check your system’s resource usage during IDP backups. Insufficient system resources, such as disk space or RAM, can impede IDP’s performance.
- Background Processes: Terminate unnecessary background processes that might compete for bandwidth or system resources during IDP backups.
Technical Support:
- Intuit Support: If you’ve exhausted all troubleshooting steps without success, contact Intuit Support for assistance. They can provide personalized guidance and address specific issues related to your setup.
- Community Forums: Explore Intuit’s community forums to see if other users have encountered similar issues and found solutions. Peer-to-peer support can be invaluable in resolving complex problems.
- Professional Help: Consider consulting with a qualified IT professional if you’re unable to resolve the issue on your own. They can perform a comprehensive analysis of your system and network infrastructure to identify and rectify any underlying issues.
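As a concrete illustration of the port-configuration step, here is a minimal Python sketch that generates Windows Firewall commands for the four ports listed above. The rule names are hypothetical placeholders; on a real machine you would run the printed netsh commands from an elevated prompt.

```python
# Generate inbound Windows Firewall rules for the ports Intuit Data Protect
# uses, per the list above. Rule names are hypothetical placeholders.
IDP_PORTS = [80, 443, 56728, 55368]

commands = [
    f'netsh advfirewall firewall add rule name="Intuit Data Protect {port}" '
    f'dir=in action=allow protocol=TCP localport={port}'
    for port in IDP_PORTS
]

for cmd in commands:
    print(cmd)
```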
By following these troubleshooting steps, you can effectively address firewall or connection issues affecting Intuit Data Protect. Remember to document any changes made to your system configuration for future reference.
Scale Real-Time Streams to Delta Lakehouse with Apache Flink on Azure HDInsight on AKS
This post is co-authored with Keshav Singh, Principal Engineering Lead, Microsoft Purview Data Governance
In this blog, we turn the page and learn about enabling the Delta format as both source and sink for stream processing with Apache Flink. Delta has become a de facto ACID-compliant Lakehouse format, enabling petabyte-scale processing while serving as a single source of truth, so it is essential to bring it all together on top of Microsoft Fabric. Data engineering in Delta format unifies diverse data sources into a single model for analytics. Finally, as technologies such as the Fabric endpoint and Synapse Serverless SQL become more efficient by the day, direct-mode Delta access will get cheaper and faster, with no real need for an edge copy for analytics.
Streaming events can now be unified in Delta format as a sink, enabling real-time analytics.
Let us consider a sales event scenario; the event has the structure illustrated below.
The Sales Source Event is stored in Delta Format on ADLS Gen2.
HDInsight on AKS Cluster Pool
Create a cluster pool to host a set of clusters; these could be Spark, Trino, or Flink clusters. With the cluster-pool concept, and as a platform-as-a-service offering, HDInsight on AKS allows developers to quickly build up a data estate with all their favorite open-source workloads, with full configurability and SKU sizing of their choice.
Let’s Provision the Pool.
Next, provision a Flink cluster; we went with a session cluster.
In a nutshell, a session cluster can share resources among multiple jobs, while an application cluster dedicates its resources to a single application.
Once the cluster is provisioned, update the Flink configs to add the Hadoop classpath and ensure the cluster’s native class loaders are loaded.
Upon applying the changes, the cluster will restart. Click on the Flink Dashboard and verify that it is available; this is the one place for the DAG, execution logs, and stream-processing details.
Application Code
Here is our code for StreamProcessingJob.
This code simply reads the data from a Delta source and stream processes it to Delta Sinks.
package org.example;

import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import io.delta.flink.sink.DeltaSink;
import io.delta.flink.source.DeltaSource;
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.core.fs.Path;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.table.data.RowData;
import org.apache.flink.table.types.logical.RowType;
import org.apache.flink.table.types.logical.TimestampType;
import org.apache.flink.table.types.logical.VarCharType;
import org.apache.flink.table.types.logical.IntType;
import org.apache.hadoop.conf.Configuration;
import java.util.Arrays;

public class StreamProcessingJob {

    // Schema of the sales event, matching the Delta table layout
    public static final RowType ROW_TYPE = new RowType(Arrays.asList(
            new RowType.RowField("SalesId", new VarCharType(VarCharType.MAX_LENGTH)),
            new RowType.RowField("ProductName", new VarCharType(VarCharType.MAX_LENGTH)),
            new RowType.RowField("SalesDateTime", new TimestampType()),
            new RowType.RowField("SalesAmount", new IntType()),
            new RowType.RowField("EventProcessingTime", new TimestampType())
    ));

    public static void main(String[] args) throws Exception {
        final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        env.enableCheckpointing(10000); // Delta commits are issued on checkpoints

        // Define the sink Delta table path
        String deltaTablePath_sink = "abfss://flink@<storage>.dfs.core.windows.net/Streams/SaleSink";
        // Define the source Delta table path
        String deltaTablePath_source = "abfss://flink@<storage>.dfs.core.windows.net/Streams/SaleSource";

        // Create a bounded Delta source for all columns
        DataStream<RowData> deltaStream = createBoundedDeltaSourceAllColumns(env, deltaTablePath_source);
        createDeltaSink(deltaStream, deltaTablePath_sink, ROW_TYPE);

        // Execute the Flink job
        env.execute("FlinkDeltaSourceSinkExample");
    }

    public static DataStream<RowData> createBoundedDeltaSourceAllColumns(
            StreamExecutionEnvironment env,
            String deltaTablePath) {
        DeltaSource<RowData> deltaSource = DeltaSource
                .forBoundedRowData(
                        new Path(deltaTablePath),
                        new Configuration())
                .build();
        return env.fromSource(deltaSource, WatermarkStrategy.noWatermarks(), "deltaSource");
    }

    public static DataStream<RowData> createDeltaSink(
            DataStream<RowData> stream,
            String deltaTablePath,
            RowType rowType) {
        DeltaSink<RowData> deltaSink = DeltaSink
                .forRowData(
                        new Path(deltaTablePath),
                        new Configuration(),
                        rowType)
                .build();
        stream.sinkTo(deltaSink);
        return stream;
    }
}
Here is the pom.xml for the Java project:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>org.example</groupId>
<artifactId>deltaflinkproject</artifactId>
<version>1.0-SNAPSHOT</version>
<properties>
<maven.compiler.source>1.8</maven.compiler.source>
<maven.compiler.target>1.8</maven.compiler.target>
<flink.version>1.17.0</flink.version>
<java.version>1.8</java.version>
<scala.binary.version>2.12</scala.binary.version>
<hadoop-version>3.4.0</hadoop-version>
</properties>
<dependencies>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-java</artifactId>
<version>${flink.version}</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.flink/flink-streaming-java -->
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-streaming-java</artifactId>
<version>${flink.version}</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.flink/flink-clients -->
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-clients</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>io.delta</groupId>
<artifactId>delta-standalone_2.12</artifactId>
<version>3.0.0</version>
</dependency>
<dependency>
<groupId>io.delta</groupId>
<artifactId>delta-flink</artifactId>
<version>3.0.0</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-parquet</artifactId>
<version>${flink.version}</version>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-client</artifactId>
<version>${hadoop-version}</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-table-runtime</artifactId>
<version>${flink.version}</version>
<scope>provided</scope>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-assembly-plugin</artifactId>
<version>3.0.0</version>
<configuration>
<appendAssemblyId>false</appendAssemblyId>
<descriptorRefs>
<descriptorRef>jar-with-dependencies</descriptorRef>
</descriptorRefs>
</configuration>
<executions>
<execution>
<id>make-assembly</id>
<phase>package</phase>
<goals>
<goal>single</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
</project>
We build a JAR and upload it to a convenient directory on ADLS Gen2.
At this point we are ready to submit the StreamProcessingJob on the Flink cluster. Point to the JAR location on the storage and provide the entry-class details.
The job processed the streams based on the logic defined in the Java JAR.
Callouts and Validations
Flink periodically commits into Delta based on the configured checkpointing interval: env.enableCheckpointing(10000); in our case, commits are issued every 10 seconds.
NOTE: One critical observation to keep in mind for the initial dataset: if your job duration is shorter than the checkpoint interval (say you have only 10 records and the job completes well before the first checkpoint), you will observe only Parquet files and no _delta_log directory, since the first Delta commit was never issued. This edge case is worth calling out as a reminder that streaming (unbounded) semantics are quite different from batch (bounded) processing semantics.
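A quick way to spot this edge case is to check the table directory for a _delta_log folder. The sketch below is a hypothetical local-filesystem check in Python (against ADLS Gen2 you would use a storage SDK instead): Parquet files without _delta_log mean the first Delta commit was never issued.

```python
from pathlib import Path

def first_delta_commit_issued(table_dir: str) -> bool:
    """True if the directory holds Parquet data AND a _delta_log folder,
    i.e. at least one Delta commit has been made."""
    root = Path(table_dir)
    has_parquet = any(root.rglob("*.parquet"))  # data files written
    has_log = (root / "_delta_log").is_dir()    # Delta transaction log exists
    return has_parquet and has_log
```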
The screenshots below depict periodic processing of this data.
The initial run creates the Parquet files but has yet to issue the first Delta commit, hence we observe only Parquet files.
Upon the first commit, the Delta table is initialized, and processing of the streams continues based on the checkpoint interval.
Upon completion and checkpointing, all in-progress files are committed.
Checkpointing is a crucial feature in distributed stream processing frameworks like Flink to ensure fault tolerance and exactly-once semantics.
Relevance of env.enableCheckpointing(10000):
Ensures Fault Tolerance: Checkpointing allows Flink to take snapshots of the state of the streaming application at regular intervals. In case of failures, Flink can use these snapshots to restore the state of the application and continue processing from the last successful checkpoint. This ensures fault tolerance and resilience against failures such as machine crashes, network issues, or software bugs.
Consistent State: Checkpointing helps in maintaining consistent state in the face of failures. By periodically saving the state of the application, Flink guarantees that even if failures occur, the state can be recovered to a consistent point.
Exactly-once Processing: Checkpointing, combined with Flink’s processing model, enables exactly-once semantics. With exactly-once processing, each record in the input streams is processed exactly once, even in the presence of failures and restarts. This is crucial for applications where data correctness is paramount, such as financial transactions or real-time analytics.
Performance Considerations: The checkpointing interval (in this case, 10000 milliseconds or 10 seconds) is a trade-off between fault tolerance and performance. Shorter intervals provide better fault tolerance but can impact performance due to the overhead of taking and managing checkpoints. Longer intervals reduce this overhead but increase the potential amount of data loss in case of failures. Choosing an appropriate interval depends on the specific requirements of the application.
Configuration Flexibility: Flink provides flexibility in configuring checkpointing behavior. Developers can tune various parameters such as checkpointing interval, checkpointing mode (e.g., exactly-once, at-least-once), state backend, and storage options based on the specific needs of their application and the underlying infrastructure.
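The trade-off behind the interval choice can be made concrete with a little arithmetic. Under the idealized assumption that checkpoints are instantaneous and recovery replays from the last completed checkpoint, the amount of stream that must be reprocessed after a failure is simply the time elapsed since that checkpoint; a hypothetical back-of-the-envelope sketch:

```python
def replay_seconds(failure_at_s: float, checkpoint_interval_s: float) -> float:
    """Seconds of stream replayed after a failure at failure_at_s, assuming
    instantaneous checkpoints every checkpoint_interval_s and recovery from
    the last completed checkpoint (idealized model)."""
    return failure_at_s % checkpoint_interval_s

# With the job's 10 s interval, a crash at t = 37 s replays 7 s of data;
# a 60 s interval would replay 37 s instead.
```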
Finally, let’s validate the Delta sink for our processed streams.
Architectural Considerations
In contrast with other streaming offerings, Flink offers built-in support for managing complex stateful computations efficiently. It provides a unified runtime for both batch and stream processing, allowing seamless integration of stateful operations into streaming jobs. Flink’s state management capabilities include fault tolerance, exactly-once semantics, and flexible state backend options (e.g., memory, RocksDB).
Flink provides strong consistency guarantees with exactly-once processing semantics out of the box. It ensures that each event is processed exactly once, even in the presence of failures or restarts, making it suitable for mission-critical applications.
Flink is designed for high throughput and low-latency processing at scale. It supports fine-grained control over resource allocation and dynamic scaling, allowing efficient utilization of cluster resources. Flink’s pipelined execution model and advanced optimizations contribute to its superior performance.
Flink integrates seamlessly with other components of the Apache ecosystem, such as Apache Kafka, Apache Hadoop, and Apache Hive. It also provides connectors for integrating with various cloud platforms and data sources.
Flink excels in handling complex stateful workloads with its advanced state management, processing guarantees, scalability, and performance optimizations.
Conclusion
We have introduced Azure HDInsight on AKS (Flink) and emphasized the Delta Lakehouse story.
This blog is for those passionate data engineers who are data natives and love to design a system that is resilient, frugal, and precise; write those hard lines of code; control their destiny; and stay curious about what lies under the hood. We dedicate this blog to all such engineers, and extend you a warm welcome to a fully managed Apache Flink on Azure, with Azure HDInsight on AKS!
Get started today – Microsoft Azure
Read our documentation – What is Apache Flink® in Azure HDInsight on AKS? (Preview) – Azure HDInsight on AKS | Microsoft Learn
Questions? Please reach out to us on aka.ms/askhdinsight
Where do I get the Data Set of EMG Signals with Standard features
I need to classify EMG signals through feature analysis, so I need standard features of EMG signals; with that standard data, I can compare against disease data. emg standard data MATLAB Answers — New Questions
Issue with slanted y-axis ticks when using UIAxes component in app designer.
Hi,
I’m attempting to create a MATLAB app that displays a 2D plot, but I’m intermittently having issues with the y-axis ticks when inserting a UIAxes plot.
It only seems to happen for certain combinations of x and y data. My guess is that it has something to do with how UIAxes supports all three axes. From what I can tell, there isn’t an option to lock the UIAxes into a 2D format.
I’ve tried programmatically resetting the viewing angle after plotting, but that doesn’t seem to fix it.
Here is the code I’m using to plot:
plot(Ax,X,Y,'-k');
view(Ax, 0, 90);
grid(Ax,'minor');
Any help would be much appreciated. app designer, plotting MATLAB Answers — New Questions
How to Run raytrace or siteviewer related program in Parallel Computing Toolbox?
I am attempting to simulate coverage for different 'txsite' objects across various OSM maps, aiming to expedite the process through parallelization. However, it appears that 'siteviewer' is not compatible with the Parallel Computing Toolbox.
1. I tried to create a 'siteviewer' object within a parallel loop:
pm = propagationModel("raytracing", ...
    "Method", "sbr", ...
    "MaxNumReflections", 5);
osmMaps = ["hongkong.osm","chicago.osm","manhattan.osm"];
lats = [22.27,41.87,40.71];
lons = [114.15,-87.63,-74.00];
parfor i = 1:3
    viewer = siteviewer('Buildings', osmMaps(i), 'Terrain', "none", 'Basemap', "darkwater");
    txs = txsite(Latitude=lats(i), Longitude=lons(i));
    pd = coverage(txs, pm, 'Map', viewer);
    disp(pd);
    close(viewer);
end
The above code resulted in the following error: "Error using siteviewer: Timeout waiting for response." Despite adjusting several initialization parameters, the error persisted. When I changed 'parfor' to 'for', the program ran without any errors. My understanding is that 'siteviewer' objects cannot be created within 'parfor'.
2. Consequently, I altered my parallelization strategy by creating the 'siteviewer' object in advance and referencing it within the 'parfor' loop:
viewer = siteviewer('Buildings', 'hongkong.osm', 'Terrain', "none", 'Basemap', "darkwater");
lats = [22.27,22.273,22.275];
lons = [114.15,114.152,114.154];
parfor i = 1:3
    txs = txsite(Latitude=lats(i), Longitude=lons(i)); % Each iteration places 'txs' at a different location.
    pd = coverage(txs, pm, 'Map', viewer);
end
close(viewer);
Referencing the 'viewer' object within 'parfor' still led to errors.
Output: Warning: While loading an object of class 'siteviewer':
Variables of this type are not subscriptable using dot indexing.
3. Further, I examined the source code of the 'coverage' function and debugged the parallel code.
viewer = siteviewer('Buildings', 'hongkong', 'Terrain', "none", 'Basemap', "darkwater");
future = parfeval(@()viewer, 1); % Assuming 'viewer' is already created.
[~, thisResult] = fetchNext([future]);
% Output: Warning: While loading an object of class 'siteviewer':
% Variables of this type are not subscriptable using dot indexing.
future = parfeval(@()siteviewer.all, 1); % Assuming existing 'siteviewer' objects.
[~, thisResult] = fetchNext([future]);
% Output: thisResult = 0x0 cell
future = parfeval(@()siteviewer.current, 1); % Assuming existing 'siteviewer' objects.
[~, thisResult] = fetchNext([future]);
% Output: Error using siteviewer: Timeout waiting for response.
I am puzzled: the ray-tracing computation is slow, cannot be expedited using GPU acceleration, and cannot utilize the Parallel Computing Toolbox, resulting in low efficiency. Is there any solution to run 'coverage' in parallel?
I have consulted many Q&As, including the following: answer_link
However, this Answer is evidently incorrect; the 1000 rays within a cell are identical, failing to achieve parallel acceleration, and moreover, the rays were computed incorrectly. Based on the issues I’ve encountered, ‘raytrace’ in a parallel loop is unable to access previously created ‘viewer’ objects, and the ‘Map’ in ‘raytrace(tx,rx)’ differs from the one I defined, leading to incorrect results.I am attempting to simulate coverage for different ‘txsites’ across various osm maps, aiming to expedite the process through parallelization. However, it appears that ‘siteviewer’ is not compatible with the Parallel Computing Toolbox.
1. I tried to create a ‘siteviewer’ object within a parallel loop:
pm = propagationModel("raytracing", …
"Method", "sbr", …
"MaxNumReflections", 5);
osmMaps = ["hongkong.osm","chicago.osm","manhattan.osm"];
lats = [22.27,41.87,40.71];
lons = [114.15,-87.63,-74.00];
parfor i=1:3
viewer = siteviewer(‘Buildings’, osmMaps(i), ‘Terrain’, "none", ‘Basemap’, "darkwater");
txs = txsite(Latitude=lats(i),Longitude=lons(i));
pd = coverage(txs, pm, ‘Map’, viewer);
disp(pd);
close(viewer);
end
The above code resulted in the following error: Error using siteviewer, Timeout waiting for response. Despite adjusting several initialization parameters, the error persisted. When I changed ‘parfor’ to ‘for’, the program ran without any errors. My understanding is that ‘siteviewer’ objects cannot be created within ‘parfor’.
2. Consequently, I altered my parallelization strategy by creating the ‘siteviewer’ object in advance and referencing it within the ‘parfor’ loop:
viewer = siteviewer(‘Buildings’, ‘hongkong.osm’, ‘Terrain’, "none", ‘Basemap’, "darkwater");
lats = [22.27,22.273,22.275];
lons = [114.15,114.152,114.154];
parfor i = 1:3
txs = txsite(Latitude=lats(i),Longitude=lons(i)); % Each iteration, ‘txs’ are at different locations.
pd = coverage(txs, pm, ‘Map’, viewer);
end
close(viewer);
Referencing the ‘viewer’ object within ‘parfor’ still led to errors.
Output: Warning: While loading an object of class ‘siteviewer’:
Variables of this type are not subscriptable using dot indexing.
3. Further,I examined the source code of the ‘coverage’ function and debugged the parallel code.
viewer = siteviewer(‘Buildings’, ‘hongkong’, ‘Terrain’, "none", ‘Basemap’, "darkwater");
future = parfeval(@()viewer, 1); % Assuming ‘viewer’ is already created.
[~, thisResult] = fetchNext([future]);
% Output: Warning: While loading an object of class ‘siteviewer’:
% Variables of this type are not subscriptable using dot indexing.
future = parfeval(@()siteviewer.all, 1); % Assuming existing ‘siteviewer’ objects.
[~, thisResult] = fetchNext([future]);
% Output: thisResult = 0x0 cell
future = parfeval(@()siteviewer.current, 1); % Assuming existing ‘siteviewer’ objects.
[~, thisResult] = fetchNext([future]);
%Output: error using siteviewer, Timeout waiting for response.
I am puzzled, as the computation for Ray Tracing is slow, cannot be expedited using GPU acceleration, nor can it utilize the Parallel Computing Toolbox, resulting in low efficiency. Is there any solution to run ‘coverage’ in parallel?
I have consulted many Q&As, including the following: answer_link
However, this Answer is evidently incorrect; the 1000 rays within a cell are identical, failing to achieve parallel acceleration, and moreover, the rays were computed incorrectly. Based on the issues I’ve encountered, ‘raytrace’ in a parallel loop is unable to access previously created ‘viewer’ objects, and the ‘Map’ in ‘raytrace(tx,rx)’ differs from the one I defined, leading to incorrect results. I am attempting to simulate coverage for different ‘txsites’ across various osm maps, aiming to expedite the process through parallelization. However, it appears that ‘siteviewer’ is not compatible with the Parallel Computing Toolbox.
1. I tried to create a ‘siteviewer’ object within a parallel loop:
pm = propagationModel("raytracing", …
"Method", "sbr", …
"MaxNumReflections", 5);
osmMaps = ["hongkong.osm","chicago.osm","manhattan.osm"];
lats = [22.27,41.87,40.71];
lons = [114.15,-87.63,-74.00];
parfor i=1:3
viewer = siteviewer(‘Buildings’, osmMaps(i), ‘Terrain’, "none", ‘Basemap’, "darkwater");
txs = txsite(Latitude=lats(i),Longitude=lons(i));
pd = coverage(txs, pm, ‘Map’, viewer);
disp(pd);
close(viewer);
end
The above code resulted in the following error: Error using siteviewer, Timeout waiting for response. Despite adjusting several initialization parameters, the error persisted. When I changed ‘parfor’ to ‘for’, the program ran without any errors. My understanding is that ‘siteviewer’ objects cannot be created within ‘parfor’.
2. Consequently, I altered my parallelization strategy by creating the ‘siteviewer’ object in advance and referencing it within the ‘parfor’ loop:
viewer = siteviewer(‘Buildings’, ‘hongkong.osm’, ‘Terrain’, "none", ‘Basemap’, "darkwater");
lats = [22.27,22.273,22.275];
lons = [114.15,114.152,114.154];
parfor i = 1:3
txs = txsite(Latitude=lats(i),Longitude=lons(i)); % Each iteration, ‘txs’ are at different locations.
pd = coverage(txs, pm, ‘Map’, viewer);
end
close(viewer);
Referencing the ‘viewer’ object within ‘parfor’ still led to errors.
Output: Warning: While loading an object of class ‘siteviewer’:
Variables of this type are not subscriptable using dot indexing.
3. Further,I examined the source code of the ‘coverage’ function and debugged the parallel code.
viewer = siteviewer(‘Buildings’, ‘hongkong’, ‘Terrain’, "none", ‘Basemap’, "darkwater");
future = parfeval(@()viewer, 1); % Assuming ‘viewer’ is already created.
[~, thisResult] = fetchNext([future]);
% Output: Warning: While loading an object of class ‘siteviewer’:
% Variables of this type are not subscriptable using dot indexing.
future = parfeval(@()siteviewer.all, 1); % Assuming existing ‘siteviewer’ objects.
[~, thisResult] = fetchNext([future]);
% Output: thisResult = 0x0 cell
future = parfeval(@()siteviewer.current, 1); % Assuming existing ‘siteviewer’ objects.
[~, thisResult] = fetchNext([future]);
%Output: error using siteviewer, Timeout waiting for response.
I am puzzled: the ray tracing computation is slow, cannot be expedited using GPU acceleration, and cannot utilize the Parallel Computing Toolbox, resulting in low efficiency. Is there any solution to run 'coverage' in parallel?
I have consulted many Q&As, including the following: answer_link
However, this answer is evidently incorrect; the 1000 rays within a cell are identical, failing to achieve parallel acceleration, and moreover, the rays were computed incorrectly. Based on the issues I have encountered, 'raytrace' in a parallel loop is unable to access previously created 'viewer' objects, and the 'Map' in 'raytrace(tx,rx)' differs from the one I defined, leading to incorrect results. coverage, parallel computing, siteviewer, raytrace, propagation and channel models MATLAB Answers — New Questions
License Error while running Yolov4 object detector
I'm getting the following error message when I run the YOLOv4 object detector. Also, the training data is not saved.
Error using vision.internal.cnn.validation.checkDetectionInputImage
License checkout failed.
License Manager Error -15
Unable to connect to the license server.
Check that the network license manager has been started, and that the client machine can communicate with the license server.
Troubleshoot this issue by visiting:
https://www.mathworks.com/support/lme/15
Diagnostic Information:
Feature: Video_and_Image_Blockset
License path:
Licensing error: -15,0.
Error in yolov4ObjectDetector/parseDetectInputs (line 649)
[sz,params.DetectionInputWasBatchOfImages] = vision.internal.cnn.validation.checkDetectionInputImage(…
Error in yolov4ObjectDetector/detect (line 440)
params = parseDetectInputs(detector,I,varargin{:});
How can I resolve this issue? image processing, license, video and image blockset MATLAB Answers — New Questions
How to change the tenant in the new Teams desktop app
Hi,
on my machine I have the new Teams app.
My username is associated with multiple tenants, but in the new Teams I don't see any option to change the associated tenant.
Can anyone help me solve this issue, please? Thanks
Demystifying QuickBooks Error H505: Causes, Solutions, and Prevention Strategies
QuickBooks is an essential tool for businesses to manage their finances efficiently. However, like any software, QuickBooks is susceptible to errors that can disrupt workflow and productivity. One such error is QuickBooks Error H505. In this blog post, we will explore the causes of Error H505, discuss effective solutions, and provide strategies to prevent its occurrence.
Understanding QuickBooks Error H505
QuickBooks Error H505 is a multi-user mode error that occurs when a user attempts to access a company file located on another computer. This error indicates that QuickBooks is unable to switch to multi-user mode and prevents users from accessing the company file remotely. It can be frustrating for businesses relying on multi-user mode to collaborate and work simultaneously on QuickBooks.
Causes of QuickBooks Error H505
Several factors can contribute to the occurrence of Error H505 in QuickBooks:
Incorrect hosting configuration: Error H505 may occur if QuickBooks hosting settings are not configured properly on the server or main computer where the company file is stored.
Firewall or security software blocking access: Firewall settings or security software on the computer hosting the company file may block access to QuickBooks, resulting in Error H505.
Incorrect DNS settings: Issues with Domain Name System (DNS) settings can also trigger Error H505, as QuickBooks relies on proper DNS configuration to locate company files.
Damaged or corrupted company file: If the company file being accessed is damaged or corrupted, it can lead to Error H505.
Solutions to QuickBooks Error H505
Resolving QuickBooks Error H505 requires identifying the underlying cause and implementing appropriate solutions:
Verify hosting settings: Ensure that hosting is turned on only for the server or main computer where the company file is stored. To verify hosting settings, go to the File menu in QuickBooks and select Utilities > Host Multi-User Access.
Check firewall settings: Configure firewall or security software to allow QuickBooks access to the necessary ports. Add exceptions for QuickBooks programs and files in your firewall settings to prevent them from being blocked.
Update and configure DNS settings: Verify and update DNS settings on all computers accessing QuickBooks to ensure proper connectivity. Consult with your network administrator or internet service provider if necessary.
Repair company file: If the company file is damaged or corrupted, use the QuickBooks File Doctor tool to repair the file. This tool can help identify and fix common issues with company files that may be causing Error H505.
Preventing QuickBooks Error H505
Preventing QuickBooks Error H505 from occurring in the future requires proactive measures:
Regularly update QuickBooks: Keep QuickBooks updated to the latest version to ensure compatibility and stability. Updates often include fixes for known issues and vulnerabilities.
Educate users: Train all users on proper QuickBooks setup and usage, including how to switch to multi-user mode and configure hosting settings correctly.
Monitor network configuration: Regularly review and update network settings, including DNS configuration, firewall settings, and router configurations, to ensure smooth connectivity for QuickBooks.
Backup company files: Implement a regular backup schedule for company files to prevent data loss in case of file corruption or damage.
Conclusion
QuickBooks Error H505 can disrupt workflow and hinder collaboration for businesses relying on multi-user mode. By understanding the causes of Error H505 and implementing the solutions and prevention strategies outlined in this blog post, businesses can effectively troubleshoot and prevent this error, ensuring seamless access to QuickBooks for their team. With proactive measures in place, businesses can minimize the impact of QuickBooks Error H505 and maintain uninterrupted productivity in their financial management processes.
When will in-cell editing in the new Outlook become available & why is Copilot available in the new
Ever since M$ released the "in-cell editing" feature in Outlook (some two decades ago), I've used it for practically every email I respond to (I average about 50 responses per day). Now, Copilot tells me that if I want to leverage Copilot, it is not available in the classic Outlook, only in the "new" Outlook (Monarch). Copilot did reveal that M$ will stop supporting the classic Outlook in 2025. However, what is extremely concerning is that Copilot for M365 is not supported within the classic Outlook, even though our enterprise is paying the full retail monthly subscription of $30/month for Copilot for M365, while the feature is not available in the most used "information management" tool in enterprises, namely, Outlook. Please add me to the millions of global voices who will decry this action, as M$ will be hamstringing many of its power-user clients. I post this as the CIO who has led the migration of the Eastern Cape Provincial Government from the Enterprise Agreement (Software Assurance) model to the subscription-based model of M365 (formerly known as Office 365).
Block File Sharing to a Network Subnet
Hey – I have a use case to detect and block files being saved to storage devices / file shares on a subnet 192.168.0.0/16 (to prevent users connected over VPN copying data to their home LAN).
Is that possible using Microsoft Endpoint DLP or MDE?
thanks
Can Copilot extract data from a Sharepoint list (to which I have access to)?
In Loop, I asked Copilot to create a table with similar instructions: “Design a table and for each row in the SP list ‘A’ on the ‘Z’ SP site that has column ‘C’ empty and column ‘D’ with the value email address removed for privacy reasons add a row to the table with the following columns: column ‘f’ and ‘g’ with values as in the SP list ‘A’ and then the following empty columns ‘h’,’i’,’l’. Add 5 empty rows at the end”
It created the table with just one row and no values extracted from the SharePoint list to which I have access. Is there a way I can instruct Copilot to do that?
SharePoint Online Deletion of Non-Empty Folders
A recent SharePoint Online update enables folder deletion when items are present in a folder. This is probably the way that things should have always worked. Even so, it's good to have this capability because it helps site users clean out old and obsolete information, something that's becoming increasingly important in the AI era for Microsoft 365.
https://office365itpros.com/2024/05/15/folder-deletion-sharepoint/
Ways to Enable 'Never Combine Taskbar Buttons' in Windows 11
Troubleshooting Solutions for “Never Combine Taskbar Buttons” Issue in Windows 11
The taskbar in Windows 11 provides convenient access to open applications and windows, but some users may find it cumbersome when multiple windows of the same application are combined into a single button. If you’re facing this issue and prefer each window to have its own button on the taskbar, you can follow these troubleshooting solutions:
1. Check Taskbar Settings:
Start by accessing the taskbar settings in Windows 11. Right-click on an empty space on the taskbar and select “Taskbar settings” from the context menu. In the settings window, scroll down to the “Combine taskbar buttons” section.
2. Adjust Taskbar Button Settings:
Under the “Combine taskbar buttons” section, you’ll find several options for how taskbar buttons are displayed. By default, Windows 11 may be set to combine buttons when the taskbar is full. Change this setting to “Never” to ensure each window has its own button.
3. Customize Taskbar Behaviors:
Windows 11 offers customization options for the taskbar behaviors. Explore the “Taskbar behaviors” section in the taskbar settings to further refine how taskbar buttons behave. You can adjust settings related to taskbar buttons, such as whether they automatically hide in tablet mode or how notifications are displayed.
4. Use Registry Editor:
If the above solutions don’t resolve the issue, you can use the Registry Editor to tweak taskbar settings manually. Be cautious when making changes to the registry, as incorrect modifications can cause system instability. To access the Registry Editor, type “regedit” in the Windows search bar and press Enter. Navigate to the following registry key:
HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Explorer\Advanced
Look for a DWORD value named “TaskbarGlomLevel.” Double-click on it and change its value to “0” to disable combining taskbar buttons.
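For convenience, the registry tweak above can also be captured as a .reg file and imported (double-click it, or use reg import). This is a sketch of the same key and value described in the steps above; back up your registry before applying it:

```
Windows Registry Editor Version 5.00

[HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Explorer\Advanced]
"TaskbarGlomLevel"=dword:00000000
```

Setting the DWORD to 0 corresponds to the "never combine" behavior; deleting the value restores the default.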
5. Restart Windows Explorer:
After making changes to taskbar settings or the registry, it’s recommended to restart Windows Explorer to apply the changes. To do this, right-click on the taskbar and select “Task Manager.” In the Task Manager window, scroll down to the “Windows Processes” section, right-click on “Windows Explorer,” and select “Restart.”
6. Check for Updates:
Ensure that your Windows 11 operating system is up to date. Microsoft regularly releases updates that include bug fixes and performance improvements. Open the Settings app, go to “Windows Update,” and click on “Check for updates” to see if there are any available updates for your system.
By following these troubleshooting solutions, you should be able to set taskbar buttons to "Never combine" in Windows 11 and ensure each window has its own button on the taskbar. If the issue persists, consider reaching out to Microsoft support for further assistance.
Teams sound no longer working after recent update for MacBook Pro
I have used Teams for years. There was an update last week, and ever since, when in a Teams meeting I can't hear other people, but they can hear me. I have tried with and without headphones, and the thing that really stumps me is that if I make a direct call through Teams to an individual, it is fine: I can hear them and they can hear me. I am using a MacBook Pro (Apple M1 Max, Sonoma, most recent version).
Can anyone help ?
SQL Server Always On Availability group on AKS with DH2i’s DxOperator and Rancher by SUSE
Did you know you can do a quickfire deployment of SQL Server Always On Availability Groups on Azure Kubernetes Service using DH2i's DxOperator and Rancher by SUSE within a few minutes?
You might already be aware of deploying SQL Server Always On Availability Groups for SQL Server containers on Kubernetes using DH2i's DxOperator. Refer to Always on Availability Groups for SQL Server containers on Kubernetes – The DH2i's DxOperator way!! – Microsoft Community Hub for more details.
Utilizing Rancher with Azure Kubernetes Service (AKS) offers several benefits, particularly in terms of security, management, and monitoring capabilities.
Rancher enhances AKS security by providing tools for hardening, governance, and integrated logging and monitoring. It also includes a built-in service mesh and has recently added CIS Scanning to assess RKE clusters against the CIS Benchmark for Kubernetes.
It simplifies cluster management by offering features such as cluster provisioning, centralized security management, and a global catalog for multi-cluster applications. It allows for infrastructure as code using Terraform, enabling consistent deployment and self-service for development teams. For monitoring SQL Server using Rancher, you can leverage tools like Prometheus and Grafana, which are part of Rancher's application catalog. These tools provide observability and help in monitoring application performance.
Recently I had the opportunity of working together on a joint solution with our partners DH2i and SUSE. Here are the steps to perform this simple and effective solution.
Before we get started, there are a few pre-requisites.
A Kubernetes cluster managed by Rancher/Rancher Prime by SUSE to deploy SQL Server instances; in this demo I use an Azure Kubernetes Service (AKS) cluster.
A client machine to run kubectl commands and manage object creation and administration on the Kubernetes cluster. I'm using a Windows machine; here are the instructions to set up kubectl to connect to and manage the AKS cluster.
A valid DxEnterprise license with availability group management features (you can get your license through Trial – DH2I).
On the same client machine, I also have SSMS (SQL Server Management Studio) or Azure Data Studio (ADS) installed to connect to SQL Server instances and view availability groups.
To set up Rancher for AKS, follow the steps mentioned in Installing Rancher on Azure Kubernetes Service | Rancher.
This demo describes the following steps required to perform this deployment, viz.
Installing DxOperator chart
Configuring secrets
Installing DxOperator – DxE + SQL Server AG deployment
Connection & Validation through SSMS
Installing the DxOperator chart from Rancher portal
Open the Rancher WebUI using its hostname.
In the left-hand menu, select the managed cluster into which you want to install DxOperator.
Click on Apps -> Charts.
Search for DxOperator, click Install, then click Next and Install.
Creating the secrets
From the left-hand menu, click on Storage -> Secrets.
Select Create, then click on the Opaque option.
Enter the secret name dxe.
Enter the key name DX_PASSKEY with the value of the DxEnterprise cluster passkey.
Select Add, then enter a new key name DX_LICENSE with the value of your DxEnterprise license. A developer key can be obtained at https://dh2i.com/trial/ .
Select Create in the bottom-right corner.
Select Create in the top-right corner for a new mssql secret.
Select Opaque and name the secret mssql.
Enter the key name MSSQL_SA_PASSWORD with the value of a strong SA password. Please note that your password must meet SQL Server password requirements, or the container deployment will fail. See the SQL Server Password Policy for more details.
Select Create in the bottom-right corner.
In the left-hand menu, select Apps > Charts.
In the list of available charts, select DxOperator – DxE + SQL Server AG.
Install the chart.
Select Install in the top-right corner.
Select Customize Helm options before install checkbox.
Select Next in the bottom-right.
Select Create Load Balancers checkbox to allow external access.
Select DxEnterprise edit options.
Select Accept EULA checkbox.
Select dxe for Cluster Secret.
Select SQL Server edit options.
Select Accept EULA checkbox.
Select mssql for SQL Secret.
Select Next in the bottom-right.
Select Install in the bottom-right.
Connecting and Validating through SSMS
From the Rancher WebUI, go to Services.
Click on dxenterprisesqlag-0-lb and fetch the external load balancer IP.
Use that IP address to connect using SSMS along with the SQL Server SA password.
From Object Explorer, click on Always On High Availability -> Availability Groups, right-click on AG1, and then select Show Dashboard.
Note:
In the screenshot below, you can see that I have created a CONTAINED AG; this is done by specifying "CONTAINED" in the Availability group options (step 3 in customizing Helm options before install). You may customize this as per your SQL AG requirements.
That's it! You have SQL Server Always On Availability Groups configured on Rancher-managed AKS with DxOperator within a few minutes. Go ahead and try it!
References:
Installing Rancher on Azure Kubernetes Service | Rancher
Quickstart: Create a public DNS zone and record – Azure portal – Azure DNS | Microsoft Learn
SQL Server HA Clustering & Software-Defined Perimeter | DH2i
Microsoft Tech Community – Latest Blogs
I want to change the default UAV dynamics to the quadcopter to F450.
Good Day!
I want to change the model of the quadcopter to F450. I have changed the Ix, Iy, Iz and mass of the default UAV dynamic model used in https://ww2.mathworks.cn/help/uav/px4/ref/hitl-simulink-plant-example.html, but this is not working for me. Could you please guide me on how to completely change the model dynamics for the F450?
uav dynamics, f450 MATLAB Answers — New Questions
I need a link and password to the MATLAB software accompanying the text Business Finance and Economics with MATLAB
Request for software link. business, economics MATLAB Answers — New Questions
How to use YOLOv7 ONNX model on MATLAB2023b ?
Hello. I am doing a study on strawberry detection using MATLAB.
I am considering using YOLOv7 for object detection, but MATLAB only supports up to yolov4.
Therefore, I would like to convert the best.pt trained with YOLOv7 to ONNX format and use it in MATLAB via the "importNetworkFromONNX" function.
Is this possible?
Please reply. yolov7 MATLAB Answers — New Questions
Do not receive reward points if Bing is set as the default search engine
I have set Bing as the default search engine in the address bar; when searching, the search URL is: https://www.bing.com/search?q=%s (%s is where the query goes).
The search works as expected. However, I don't get reward points for searching.
To earn reward points, I must visit the Bing homepage, then enter the keyword I want to search for and click Search. This is too time-consuming and pointless.
Please fix this problem.
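As a rough illustration of the template mechanism described above: the browser substitutes the typed query into the %s placeholder, much like a printf-style format string. A minimal shell sketch (the query value here is hypothetical):

```shell
# The address-bar search template uses %s as the query placeholder
template='https://www.bing.com/search?q=%s'

# The browser effectively performs this substitution with the typed query
printf "${template}\n" 'edge+rewards'
```

This is why the searches reach Bing correctly; the missing reward points appear to be a tracking issue on Bing's side rather than a malformed URL.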