
Question about Validate File Integrity


I have a general question about the validate file integrity routine. I've been using TMG for about 3 months now and have a relatively small database at this point with about 2,500 people in a single dataset in a single project. I originally started using TMG with v6.09. During the 3 months that I have been using TMG I have run the validate file integrity routine fairly regularly just to make sure I didn't have any "issues", and up until today the routine has run through without finding anything to fix.

 

This morning I upgraded the software to v6.10. The upgrade performed flawlessly. I noticed the note stating that changes had been made to the validate file integrity routine to correct additional problem areas, so I ran validate file integrity shortly after performing the upgrade. This time, instead of a clean run, I got a message telling me that something like 2,900 potential problems had been corrected. I ran the validate file integrity routine again immediately and got a clean run the second time.

 

I was caught a little off guard by the large number that had been reported. So my question is this: what sorts of things is the validate file integrity routine now looking for that would cause my small-scale project to report nearly 3,000 potential problems? Were there problem areas in my original data that I didn't even know about? And when the routine says that it has "corrected" possible problems, what exactly does that mean? Have any alterations been made to the data?

 

I am just grasping here to try to understand what goes on when I run the validate file integrity routine.

 

Thanks

Mike

There is an ongoing thread on the TMG-L list that answers some of these questions. Jim and Lee both offer insights into what's happening:

 

http://archiver.rootsweb.com/th/read/TMG/2006-10/1160527583

 

Virginia

Explanation posted by Lee Hoffman on TMG-L

 

The "fixes" that are encountered here are of the kind that cause oddities in your data where the data is correct but the display makes it appear wrong or out of sequence or something like that. In most cases, these are index problems or display problems. The file structure of TMG is such that some data is stored in two places to speed up the program. One place is the actual data and the second ("speed up") place is a copy of the actual data. At times (power or operating system glitches, etc.), the updating of the "speed up" copy may be being done and the copy is not correct. The Validate File Integrity (VFI) process checks the copy data against the actual data and when it finds them different, it then corrects the copy from the actual.

 

 

The VFI process now does even more checking than in the past, so the first time you run it you can expect a large number of fixed issues. I suggest running the VFI a second (or third) time to get a "no problems found" notice.
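Taking that advice literally, and reusing the hypothetical validate_and_fix and sample data from the sketch above, "run it again until it reports nothing" is just a loop that stops on the first clean run:

```python
# Re-run the check until a run reports zero fixes, mirroring the advice
# to run VFI a second (or third) time. Reuses validate_and_fix() and the
# sample data from the sketch above.
copy = {1: "Smith", 2: "JONES", 4: "orphan"}    # the drifted copy again
run = 0
while True:
    run += 1
    fixed = validate_and_fix(primary, copy)
    print(f"run {run}: {fixed} potential problems fixed")
    if fixed == 0:                              # stop on the first clean run
        break
# run 1: 3 potential problems fixed
# run 2: 0 potential problems fixed
```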


The reason for the high VFI count for the first run under v6.10 is explained below. This one reason will account for most or all of the new VFI 'fixes'.

 

Blank dates/sort dates in the tables should have a dummy date value entered into the field. In many cases, the fields were actually left blank. This didn't cause any program errors because the code treated both conditions the same.

 

However, we have spent considerable time identifying and correcting anomalies in the tables, and this was one of them. New code was being written that would be cleaner and faster if all date/sort date fields with blank dates contained the same dummy 'blank date' value. VFI now corrects this.
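Under the assumptions stated above (empty date/sort date fields versus a shared dummy "blank date" value), the one-time cleanup looks something like this sketch; the sentinel value and field names are invented for illustration, since TMG's internal representation isn't documented in this thread.

```python
# Sketch of the blank-date normalization described above: empty date and
# sort-date fields are given one shared dummy "blank date" value so later
# code only has to test a single canonical form. The sentinel and field
# names are hypothetical, not TMG's actual internals.

BLANK_DATE = "00000000"        # invented sentinel meaning "no date entered"

def normalize_blank_dates(rows, fields=("date", "sort_date")):
    """Replace empty date fields with the sentinel; return the fix count."""
    fixed = 0
    for row in rows:
        for field in fields:
            if not row.get(field):       # field was left truly blank
                row[field] = BLANK_DATE
                fixed += 1
    return fixed

events = [
    {"date": "19230407", "sort_date": "19230407"},
    {"date": "", "sort_date": ""},       # left blank by older code
]
print(normalize_blank_dates(events))     # 2 fixes, counted by VFI
```

Counted this way, a modest project can easily report thousands of one-time "fixes": with roughly 2,500 people and a blank date or sort date on even one event apiece, a total near the 2,900 Mike saw is unsurprising.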

 

There is likely still code that leaves blank date fields rather than entering the blank date value and we're still tracking those down and fixing them when found.

 

As Bob said in the change log and newsletter:

"It is not unusual and no cause for concern if the first use of VFI in v6.10 produces many "fixed" issues that were not previously reported."


Thanks for the responses. I also took a look at the TMG-L thread and now have a better understanding of what is taking place.

