kahuja

validate file integrity/Optimize


Kristina,

 

Help says:

 

Optimize

 

Optimizing your projects on a regular basis will help to reduce the size of your files by eliminating records no longer needed.

[snap]

The current project will be optimized. Deleted records, including people, events, sources, repositories; unused media files in Slideshow; as well as duplicate place records, will be eliminated and the space reclaimed.

 

Validate File Integrity (VFI)

 

During the course of working with your projects, many things may occur that could have an adverse effect on your files. You may encounter power failures, computer glitches, etc. TMG gives you a way to check the integrity of the TMG files in the current project.

 

When you use these features, run them in this sequence:

1) Optimize

2) Validate File Integrity

 

Repeat step 2 until VFI reports that "no potential problems have been found" in your current project. Then run Optimize again.
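TMG runs these steps from its menus, so there is no script to show, but the general pattern (compact, check integrity, then compact again once the check comes back clean) can be sketched with SQLite as a stand-in. Here VACUUM and PRAGMA integrity_check play roughly the roles of Optimize and VFI; the file name and table are hypothetical, and this is only an analogy, not TMG's actual implementation.

```python
import os
import sqlite3
import tempfile

def optimize(conn):
    # Compact the database file, reclaiming space left by deleted rows
    # (loosely analogous to TMG's Optimize).
    conn.commit()            # VACUUM cannot run inside an open transaction
    conn.execute("VACUUM")

def validate(conn):
    # Low-level structural integrity check (loosely analogous to VFI).
    # Returns ["ok"] when no problems are found.
    return [row[0] for row in conn.execute("PRAGMA integrity_check")]

# "project.db" is a stand-in file, not a real TMG project.
path = os.path.join(tempfile.mkdtemp(), "project.db")
conn = sqlite3.connect(path)
conn.execute("CREATE TABLE person (id INTEGER PRIMARY KEY, name TEXT)")

optimize(conn)             # 1) Optimize
report = validate(conn)    # 2) Validate; in TMG, rerun until no problems remain
if report == ["ok"]:
    optimize(conn)         # "Then Optimize again"
conn.close()
```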

 

I personally run them each time I've entered/changed data in TMG - say after each "TMG session" before closing TMG.


Kristina,

 

I only run them from time to time, when I have made a lot of changes. I have seen that "moving" (many) exhibits from one folder to another consumes a lot of space. Just a guess: any "move" or "change" may involve deleting the old record and creating a new one, without the freed space being reused.

 

Anyway, after optimizing you are told how much space has been freed. That might help you decide on your own interval.

 

Regards

Helmut


I run them every time I do a large edit, especially if I am deleting data.

 

Here's how I look at it:

 

I am working at my desk and have a file folder open, scanning documents. They are all jumbled up on my desk. At the end of the day, I gather them up, sort them out, refile-- that is reindex.

 

Same scenario, but at the end of the day, I sort through them, throw some away, refile some in another folder--that is optimize.

 

Same scenario, but I go through the folder and make sure these papers really do belong in this file folder. I get rid of the pretzels that fell into the folder at lunch time. I remove a note to myself about another line and put it into the correct folder--that's VFI.

 

Running VFI ensures that the data is organized, where it's supposed to be, all the pointers are in the right places and that the data is ready to go for the next editing session. Optimize removes any data that you deleted for good. It's gone. Running these regularly keeps TMG efficient, just like a clean desk keeps you efficient.


Am I right in getting the impression that it is not unusual for Validate to find "potential problems" to fix? I'm in discussion with support at the moment about another problem, but in the process I have been running Validate at the start of each TMG session. And in maybe one run in four, Validate finds another half-dozen or so "potential problems" to fix.

 

The Validate log (LastVFI.log) sheds no light on what problems are being fixed, but I'm not clear what the difference is between a "potential problem" and a "real problem". Are "potential problems" expected to arise in the normal run-of-the-mill use of TMG?

 

Bill R


VFI performs data table maintenance. If you run it once a month, that's adequate.

 

Optimize removes deleted records and compacts the data tables. Most of the size change you see when you optimize is from the change in the index files, not in the data tables.
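The point about deleted records holding onto their space until a compaction pass is not unique to TMG; most embedded databases behave the same way. A hypothetical SQLite sketch of the effect (the file name, table, and row counts are illustrative only, and SQLite's VACUUM stands in for Optimize):

```python
import os
import sqlite3
import tempfile

path = os.path.join(tempfile.mkdtemp(), "demo.db")
conn = sqlite3.connect(path)
conn.execute("CREATE TABLE person (id INTEGER PRIMARY KEY, note TEXT)")
# Fill the table so the file grows to a measurable size.
conn.executemany("INSERT INTO person (note) VALUES (?)",
                 [("x" * 1000,) for _ in range(5000)])
conn.commit()
size_before = os.path.getsize(path)

conn.execute("DELETE FROM person")   # records deleted, space not yet reclaimed
conn.commit()
conn.execute("VACUUM")               # compaction returns the space to the OS
conn.close()
size_after = os.path.getsize(path)   # noticeably smaller than size_before
```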

 

Any time that you run VFI and it makes a change, you should immediately follow that by running optimize.

 

It would be very unusual that you would need to run the VFI / Optimize sequence more than once.


This falls under the "why".

 

Why is 90% of the time in my file validation process (hours) spent on checking witnesses, even though there are hardly any to speak of (fewer than 10)?

 

It's checking for principals that are not witnesses at the moment.

Edited by retsof


All individuals linked to event tags are witnesses. And all will appear in the witness table. Witnesses can be divided into the subsets of principals and 'other witnesses'. 'Other witnesses' are what you are referring to as 'witnesses'.

Over half the time consumed by VFI on my data is spent checking for principals that are not witnesses. The time taken is very annoying. (This is a moderate-size dataset of about 83,000 persons.)

 

When this check was added a couple of years ago, it found and fixed over 100,000 errors the first time it was run. Since then it has found no errors. This check should be removed or made a preferences option.

 

Best wishes,


Mike,

 

Did you read what I wrote above?

 

Again... all individuals linked to tags are 'witnesses', and all members of the witness subset 'principals' are in the Witness data table.

 

Jim


Mike,

 

Are you being confused by a message from VFI that says "checking for principals that are not witnesses"? I would guess that the meaning of that message is, as Jim keeps repeating, all principals should be witnesses and this check is making sure that they are.

Yes I was, thank you for the translation.

 

Now, may I translate my previous question: evidently, such an error is not severe, since my TMG functioned fine for years with hundreds of thousands of them. Since no more errors of this type have been found in my data in the years since, and checking for them takes up an annoying half or more of the long VFI run time, could that check be eliminated or made skippable at some point in the future?

Thanks, again,


VFI checks data table integrity, and there is no way to judge in advance which tables need checking, so there won't be any options added to Validate File Integrity. If it takes a long time to run, let it run overnight while you're asleep.

