Slow and/or needless updating of remote database tables

I use a central database for CQRLOG which must be accessed over the Internet (it's hosted on a VPS, which also runs cqrweblog). I use several different computers for logging at different locations. Some have very slow, high-latency Internet connections. This can cause trouble when launching the program.

Looking at the source, it seems that a date comparison is performed on a local file, which triggers a download of the file if needed and an automatic update of the corresponding table (dxcc_ref, iota_list etc.) in the MySQL database. This can be very, very slow for me - sometimes an hour. That's not so bad when the update is actually needed, but updating the remote database is usually needless: only the local file is old, while the MySQL database is usually up to date, because another client has often already updated it.

Some possible ideas to speed up the launch process for users like myself:

1) Select all rows from the remote table into a local file and compare this file to the new *.tab file - only update the database if there are differences.

2) Create an entry in the config table in the database with the date that the data was last refreshed - only update when the local *.tab file is newer.

3) Create an entry in the config table in the database with the hash of the last *.tab file that was used to update the table. When a new local *.tab file is needed, compare its hash to the one stored in the table. Where there is a mismatch, update the table.

Also, when a table refresh is needed, consider the LOAD DATA INFILE or LOAD DATA LOCAL INFILE statement to speed up the process. I am more familiar with T-SQL, but this seems similar to what Microsoft calls a bulk import. Removing the need for thousands of individual INSERT statements could make the process much faster.
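For illustration, the statement might be built like this. The tab-separated layout of the *.tab files is an assumption on my part, and the MySQL client/server must both have local_infile enabled for the LOCAL variant to work:

```python
def load_data_sql(table, tab_path):
    """Build a LOAD DATA LOCAL INFILE statement that reloads a table in one
    bulk pass instead of row-by-row INSERTs. REPLACE overwrites rows whose
    primary key already exists. Assumes tab-separated fields, one row per
    line - adjust the terminators to the real *.tab format."""
    return (
        "LOAD DATA LOCAL INFILE '" + tab_path + "' "
        "REPLACE INTO TABLE " + table + " "
        "FIELDS TERMINATED BY '\\t' "
        "LINES TERMINATED BY '\\n'"
    )
```

Over a high-latency link this matters even more than raw speed, since one round trip replaces thousands of them.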

Performance is fine with a slow connection except for this, and I think if this issue could be addressed it would help people in my situation. I can of course use cqrweblog to work around this, but I prefer it mostly for mobile use - there's nothing like the real thing!



P.S. This is my first post here so if I have mis-filed this request/suggestion and it would be better suited to another area, please forgive me.

Db tables update

Hi !
I have noticed the same thing when using my server's DB at home via 3G/VPN with a laptop from the summer cottage.
Usually I either use cqrweblog or say NO to database updates while I'm not at home.

One thing that I have not tested (I must do it some day) is running a replicated DB system via VPN.

At the moment I have two-direction replication on my home network, where the ham laptop has its own database.
That is the "main" database used when working.
However, it replicates on the fly to my server, which holds a "secondary" database that can be accessed via the Internet. Away from home that one becomes the "main", and QSOs replicate back to the ham laptop when it is next started.

It works very well. Sometimes the replication order gets corrupted, but that is easy to fix with a few scripts as long as you know which database has the latest QSOs. That part is easy, as I'm the only user.

Replication should run by itself as a background process via the VPN; then using the local DB should work at normal speed.
Have to try...
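For anyone wanting to try the same, a two-way (master-master) MySQL setup is roughly the following my.cnf fragment - the paths and IDs are examples, not my actual config:

```ini
# my.cnf on the laptop ("main"); the server side uses the same settings
# with server-id = 2 and auto_increment_offset = 2.
[mysqld]
server-id                = 1
log-bin                  = mysql-bin
# Stagger auto-increment values so both sides can insert QSOs
# without primary-key collisions:
auto_increment_increment = 2
auto_increment_offset    = 1
```

Each side is then pointed at the other with CHANGE MASTER TO (CHANGE REPLICATION SOURCE TO on MySQL 8) and replication started. The staggered auto-increment settings are what keep simultaneous inserts on both databases from colliding.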

Another untested feature is the setting "Preferences/Program/Configuration storage settings", where you can choose which things are saved locally on each PC and which are common. I do not know how it affects the dxcc tables etc. It must be checked.

These are interesting, but more "advanced" problems.