efcharvet

Optimize File When Exiting TMG (Integrate Process)

Recommended Posts

Reading a few new user comments over the last few months - and thinking about my own TMG habits - has led me to suggest this:

 

I nearly always execute File > Maintenance > Optimize before I shut down. It has just become a (good) habit I've gotten into.

 

From experience, I know that it keeps my TMG database from acting flaky due to the additions and changes I make nearly every time I work with it.

 

Adding an option to the Preferences menu that would integrate File Optimize into the shut-down process when exiting TMG would be beneficial. (Perhaps it should be the default.)

 

It would help a lot of users keep their TMG databases healthier and make strange database events less likely. It is already a built-in process in many competing genealogy products.

 

Food for thought.

 

Earl


It couldn't hurt, but should still be a preference. For my largest database (until it gets merged down), that would be a 20 minute shutdown on a fairly fast computer. If I hadn't updated anything, it could get annoying.
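The concern above suggests a natural refinement: make optimize-on-exit a preference, and skip the pass entirely when nothing was changed in the session. A minimal sketch in Python, assuming a hypothetical session object (TMG exposes no such API; all names here are illustrative):

```python
# Sketch of the proposed preference: optimize on exit, but only when
# the database was actually modified this session, so an unchanged
# large project doesn't pay a 20-minute shutdown cost.

class Session:
    def __init__(self, optimize_on_exit=True):
        self.optimize_on_exit = optimize_on_exit  # the proposed preference
        self.dirty = False                        # set by any add/edit/delete

    def record_change(self):
        """Called whenever data is added, edited, or deleted."""
        self.dirty = True

    def optimize(self):
        """Placeholder for File > Maintenance > Optimize."""
        self.dirty = False

    def close(self):
        """Run optimize on shutdown only if enabled and needed.
        Returns True if an optimize pass actually ran."""
        if self.optimize_on_exit and self.dirty:
            self.optimize()
            return True
        return False
```

With this shape, a read-only session closes instantly, while a session with edits pays the optimize cost exactly once.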

Edited by retsof


It could be a preference choice, with a popup either separate from or combined with the one asking if you want a backup. Although it sometimes bugs me, particularly if I'm opening and closing TMG several times for some reason, I still leave it as an 'in your face' reminder so I have no excuse to 'forget' to back up my data and my customizing in case I've made some new, improved changes ;)

 

B)

Joan

 


I nearly always execute File > Maintenance > Optimize before I shut down.

I'd also do a backup before shutting down.

 

How large is your file?

 

It's over 600,000 members, but contains many duplicates that need merging. The member records are fairly small and there are no exhibits as yet. If there are more than 100 tags per member, it really bogs down due to all of the connections. Backup size is about 100 MB. The website http://thepeerage.com has 130,000 members that are larger on average, so his backup is about 2 GB. If I do any exhibits, I would probably go with externally linked exhibits instead of internal. I have plenty of space locally and also plenty of web space if it gets that far. The file needs to be knocked down a bit first.

 

Another project file, containing mainly ancestors and descendants of John of Gaunt Plantagenet, is over 130,000 and also contains many duplicates, the result of downloading gedcoms in 6 to 10 generation pieces. Every now and then, due to all of the intermarriages, I get another unavoidable copy of Charlemagne's family. I have gotten better at controlling the number of generations in the download when I can't grab the whole gedcom from rootsweb (not allowed to). The source citations also multiply, so some sessions with the TMG utility will fix that.

 

I made an early mistake and copied and merged too many datasets into the main project instead of starting new projects. It took a couple of months to get comfortable.

 

It gets worse: A database validate is 8 hours on an AMD64 4000+ 2.4GHz computer (equivalent to Intel 3.0GHz+). It then takes another optimize and backup. The backup is only a few minutes, so that's not too bad.

 

I have an AMD64 X2 6400+ Black Edition 3.2GHz chip here, but I don't have all of the components yet to finish building it.

 

Project future feature HINT: Competitor "Legends" claims an automatic merge of two members when both sides are completely identical. The merge here in TMG frustratingly doesn't always show what's IN the lines, so I tend to go ahead with the merge anyway. Deleting things line by line afterwards takes a while.
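The auto-merge idea above is simple to state: if the two sides of a proposed merge are completely identical, combine them without prompting; otherwise fall back to the manual review dialog. A hedged sketch, treating member records as plain dictionaries (a real implementation would compare tag lines field by field):

```python
# Sketch of "auto-merge when both sides are identical".
# Records here are plain dicts; this is an illustration of the
# decision rule, not TMG's actual merge machinery.

def try_auto_merge(a: dict, b: dict):
    """Return the merged record if a and b are completely identical,
    otherwise None so the user can review the differences manually."""
    if a == b:
        return dict(a)   # identical on every field: safe to merge silently
    return None          # any difference: defer to the manual merge dialog
```

The payoff is that the long tail of exact-duplicate members created by repeated gedcom downloads never reaches the line-by-line dialog at all.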

Edited by retsof


retsof,

 

I can see where a project of that size could take a while. I am very big on not merging in another's gedcom. If someone sends me data, I hand-type the information that is new. I rarely add hundreds of descendants at once like you are. You might want to try keeping a separate dataset for the new gedcom until you know it contains new information. Then you could add just those people that are new, instead of adding everyone and then doing massive data cleanup to merge all the duplicates.
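The workflow suggested above can be sketched in a few lines: keep the incoming gedcom in its own dataset, then copy over only the people not already present. This is an illustration only; matching on a (name, birth year) pair is a deliberate simplification of real duplicate detection:

```python
# Sketch of "add just those people that are new": filter an incoming
# dataset against the main project before merging. The (name, birth)
# key is a stand-in for a real matching rule.

def new_people_only(existing, incoming):
    """Return the records in `incoming` whose (name, birth) key
    does not already appear in `existing`."""
    seen = {(p["name"], p["birth"]) for p in existing}
    return [p for p in incoming if (p["name"], p["birth"]) not in seen]
```

Importing only the filtered list keeps the duplicate merging, and the repeated optimize passes it forces, from happening in the first place.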

 

Our research styles are very different. I want to do the research myself, using mostly primary sources, and I can go months without adding a single new person. In fact, after merging a few people in the last few days, I have fewer people than I did a month ago. You appear to be doing a one-name study where the data is being sent in by large numbers of people.

 

I usually run optimize several times a session if I am deleting information, like this week when I merged a woman with her sister after realizing they were the same woman. It only takes a few seconds on my 25,000-person database. My biggest issue is exhibits. I have them in one folder that is over 8 GB (not all are attached to TMG). Backing that up takes a while. I am so thankful TMG allows me to store them outside the program (my old program did not) so they don't have to be backed up every time I close TMG.

