
Real-time sync of large files.

Posted: Wed May 18, 2011 10:05 am
by gdu90
I am evaluating BestSync prior to purchasing licences for a home-office network. The PCs run WinXP/NTFS and we have two NAS units (Linux/CIFS) for file storage and backups.

Our first main requirement is to back up local files from each PC (source) to a NAS (target) every night. This seems to work fine, and quickly, as a one-direction sync. Thanks.

The second main requirement is to keep volatile and important local files backed up from some PCs to a NAS during the working day. I have been trying to use real-time sync (one-direction) for this, and have encountered two problems.

1. When run as a service, the real-time task usually starts OK and processes any outstanding file synchronisations to the NAS, but it later stops processing for no apparent reason. The task/service still shows as started in the BestSync main window and in services.msc, but it does not process any more file synchronisations. It cannot be cancelled from the BestSync main window, and if it is stopped via services.msc it appears to write no log, so I cannot see what it has and has not processed. When run as a user task, this does not happen. The service logon is correct (the task logs on and starts), and the NAS is up and has not hibernated its HDD.

2. Whether run by user or by service, the task has a problem with large open files, like Outlook .pst or Excel .xlsx with auto-save. I am left with multiple instances of files at the target named _bestsync_tmp_* of type *.bs_ (large files) and *.bs_._slice_ (small files). Most of these seem to correspond with failure entries in the log. I assume the problem is with VSS, or with the timing of file monitoring and the subsequent copying by the task, but I don't know what to change to solve it.

What can I try to fix these? Thanks.

Re: Real-time sync of large files.

Posted: Fri May 20, 2011 11:47 am
by RiseFly
For Question 1, please check that you have set up the login account as described on the following page:
http://www.risefly.com/fseqna.htm#SyncNetDriveAsSvc
For Question 2, the *.bs_ (large files) and *.bs_._slice_ (small files) remain because BestSync failed to copy the files, due to a network disconnection or similar. At the next synchronization, BestSync will try to resume copying from these partial files, and will delete them when the files are copied successfully. Please do not worry about these files.

Re: Real-time sync of large files.

Posted: Fri May 20, 2011 5:20 pm
by gdu90
Thank you.

re. 1, I have modified the logon and will monitor to see whether this fixes the problem. I will post again if it does not.

re. 2, I had guessed that these files were supposed to be temporary and that they related to failed copies, as you explain. However, BestSync is NOT removing them, even when a successful sync occurs after a previous failure; I have had to delete them all manually. Since the files are large and volatile, these redundant temporary files total about 3GB per PC per day. This is not OK, as it will rapidly fill our NAS. There must be some other reason they are not being deleted, and I would assume it is related to whatever is causing the failures: the process is stopping unexpectedly and not cleaning up these files. Can you suggest something else to check?

Re: Real-time sync of large files.

Posted: Sat May 21, 2011 3:03 pm
by RiseFly
You do not need to delete these intermediate files manually; please just synchronize the task again. You will notice that the number of these files does not increase.

Re: Real-time sync of large files.

Posted: Sat May 21, 2011 10:17 pm
by gdu90
Hi.

Unfortunately both problems continue:

re. 1, the change to the logon does not change the problem of the run-as-service task stopping. The service now has the logon of a NAS user with full R/W permissions, and the task itself has (i) the same userid and (ii) the "impersonate" fields empty. But it still stops processing after some time, without any log of an error. If I restart the service, it recognises that more files have changed and syncs them, but it does not do this unless it is restarted. (However, with the change to logons, I CAN now stop this run-as-service task via the BestSync main window, not just via services.msc, but still no log is written if I do this.)

re. 2, the number of temporary files IS increasing; BestSync does not remove any of them. Today on one PC I have found 2.3GB of new temporary files (dated today), but the run-as-user task is idle (it has status "monitoring file change" and there is no disk activity on the PC or NAS). The log has no errors, but there are temporary files left behind.

Is there any log file I can upload from either task?

Re: Real-time sync of large files.

Posted: Sun May 22, 2011 1:21 am
by RiseFly
The NAS network drive is not fully compatible with the "File Change Monitoring" feature, so please try changing the schedule to the "Designate"/"Minutely" schedule. BestSync scans files at high speed, and the "Minutely" schedule is more reliable than "Real-time" for a NAS drive.
If you change to the "Minutely" schedule, I think the problem of temporary files will also be resolved.
We have just released the new version 6.2.10; please try it.

Re: Real-time sync of large files.

Posted: Sun May 22, 2011 5:27 pm
by gdu90
OK. Thanks.

I was already using version 6.2.10.

re. 2, I have changed the schedule from "real-time" to "minutely", and no more temporary files are being left behind. But to use "minutely" instead of "real-time", I will need to run a user logoff script on each PC to catch changes in the minutes between the last sync and logoff. I need to run only one BestSync task at logoff, and not re-run any of the other tasks (because the others are either real-time, or run overnight by the service). I have read the FAQ (at http://www.risefly.com/fseqna_run_off.htm), so I think the script command I need is:

"C:\Program Files\RiseFly\BestSync 2011\BestSyncSvc.exe" /NoSvc /file:"C:\<filepath>\<filename>.fsf" /sync:(<task-number>)

where <task-number> is the number of the same run-as-user task that syncs users' critical data on the "minutely" schedule above.

A) Is this usage of the task number correct? Or do I need to create a duplicate of this task that is run-as-service, with a different number (but with no schedule of its own)?
B) Should there be a switch value inserted between "/sync:" and "(<task-number>)"? The example script in the FAQ has none, but the command-line syntax shows S, M, or X.

Thanks for the assistance.

Re: Real-time sync of large files.

Posted: Sun May 22, 2011 11:35 pm
by RiseFly
Re A) You do not need to create another task; just reuse the task in the user profile setting (<filename>.fsf).
Re B) The command line should be /Sync:S(<task-number>). Thank you for pointing out this mistake; I have fixed the command line on the QnA page.
We have also just fixed a bug that may cause the temporary files to remain; please try the new 6.2.12 at the download page. Maybe you can use the real-time sync and will not need the logoff script now ;).
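Putting the corrected switch together, a minimal logoff-script sketch might look like the following. This is only an illustration of the syntax discussed above: the angle-bracket placeholders (<filepath>, <filename>, <task-number>) are from the thread and must be replaced with real values, and the executable path needs quoting because it contains spaces.

```shell
@echo off
REM Sketch of the logoff script using the corrected /Sync:S switch.
REM <filepath>, <filename> and <task-number> are placeholders from the
REM thread; fill them in before use.
"C:\Program Files\RiseFly\BestSync 2011\BestSyncSvc.exe" /NoSvc /file:"C:\<filepath>\<filename>.fsf" /Sync:S(<task-number>)
```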

Re: Real-time sync of large files.

Posted: Mon May 23, 2011 10:13 am
by gdu90
re. 1, this was fixed by a simultaneous reboot of the PC and NAS, together with the changes to the logon that you suggested.

Thank you for the responsive assistance. I will test the script and afterwards purchase licences.

Re: Real-time sync of large files.

Posted: Mon May 23, 2011 5:36 pm
by gdu90
re. the logoff script:

i - If the script uses the same task number that is normally run as a user task by schedule, will it ignore the schedule, run once and stop? Or do I need a duplicate without a schedule?

ii - What will happen if there is already another instance of BestSyncSvc.exe running (for the real-time sync tasks) when the user logs off and this script is invoked (for the normally run-as-user task)? Won't the sQuery always find a "hit" on the original real-time sync task and keep looping, so stopping the log-off?

If I add new lines of code to stop any pre-existing tasks first, will this halt the other instance properly (e.g. if it is in the middle of copying files the user just saved), and how long must the script wait for this halt before continuing to run the new task as per the original code?

If the above is correct, could you please post new code for the halt-and-wait? Or a different solution if needed.
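For illustration, the kind of halt-and-wait I have in mind is sketched below. This is purely hypothetical and not an official recipe: taskkill and ping are standard Windows tools, the fixed 5-second wait is a guess, and forcibly ending BestSyncSvc.exe mid-copy may leave *.bs_ temporary files for the next sync to resume from.

```shell
@echo off
REM Hypothetical halt-and-wait sketch; not confirmed by RiseFly.
REM Stop any running BestSyncSvc.exe instances (the real-time tasks).
taskkill /IM BestSyncSvc.exe /F >nul 2>&1
REM Wait roughly 5 seconds for the process to exit
REM (the ping-loopback trick is the usual delay on Windows XP).
ping -n 6 127.0.0.1 >nul
REM Run the normally run-as-user task once, then exit.
"C:\Program Files\RiseFly\BestSync 2011\BestSyncSvc.exe" /NoSvc /file:"C:\<filepath>\<filename>.fsf" /Sync:S(<task-number>)
```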

Thanks.

[PS. My main window (v.6.2.10) still says this is the newest version when using the on-line check, so I didn't get v.6.2.12 yet.]