Synchronizing Global Resources

Intro

A back-end CUFTS instance is run solely for managing global resources. It can be backed up before title lists are loaded, so any load problems can be backed out without affecting a production system. This back-end system dumps changes to global resources in a simple (XML?) format, along with a dump of the global title list for each updated resource. Title list loaders on the front-end installations read the global resource record updates and load the title lists. Front-end loading scripts are changed to allow bypassing the resource-specific parsing system, since the title lists will already have been parsed by the back-end global resource system. This should allow for automation in the future.
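
As a rough sketch of how a front-end loader might consume such a dump (the change-file shape, element names, and fields below are only placeholders, since the actual format is still undecided):

    import xml.etree.ElementTree as ET

    # Hypothetical shape of a resource change dump; the real format is still
    # an open question ("simple (XML?) format" above), so this is illustrative only.
    SAMPLE_CHANGE_FILE = """\
    <resource_changes>
      <resource global_identifier="CUFTS-00042" modified="2009-03-01T12:00:00">
        <name>Example Journals Online</name>
        <active>true</active>
        <title_list>example_journals_online.tsv</title_list>
      </resource>
    </resource_changes>
    """

    def read_resource_changes(xml_text):
        """Yield one dict per changed global resource in the dump."""
        root = ET.fromstring(xml_text)
        for node in root.findall("resource"):
            yield {
                "global_identifier": node.get("global_identifier"),
                "modified": node.get("modified"),
                "name": node.findtext("name"),
                "active": node.findtext("active") == "true",
                "title_list": node.findtext("title_list"),
            }

    if __name__ == "__main__":
        for change in read_resource_changes(SAMPLE_CHANGE_FILE):
            # A real front-end loader would upsert the resource by its
            # global identifier and then load the referenced title list.
            print(change)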

Changes Necessary

  • A "unique global identifier" will have to be added to the global resources database. This allows sites to load their own global resources if necessary without worrying about the database id sequence matching the global resource system.
  • Resources will need a "deleted" flag. Or maybe we don't allow real deletes and just mark global resources as "inactive"?
  • System will have to properly set the "modified" timestamp on resources
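
A minimal, self-contained illustration of these three schema changes, using SQLite and made-up column names rather than the actual CUFTS schema:

    import sqlite3
    from datetime import datetime, timezone

    conn = sqlite3.connect(":memory:")
    conn.execute("""
        CREATE TABLE resources (
            id INTEGER PRIMARY KEY,
            key TEXT,
            name TEXT,
            global_identifier TEXT UNIQUE,   -- new: stable id across installs
            active BOOLEAN DEFAULT 1,        -- new: mark inactive rather than delete
            modified TIMESTAMP               -- must be refreshed on every change
        )
    """)

    def touch_resource(resource_id, **fields):
        """Update a resource and always refresh its modified timestamp."""
        fields["modified"] = datetime.now(timezone.utc).isoformat()
        sets = ", ".join(f"{col} = ?" for col in fields)
        conn.execute(f"UPDATE resources SET {sets} WHERE id = ?",
                     (*fields.values(), resource_id))

    conn.execute(
        "INSERT INTO resources (id, key, name, global_identifier, modified) "
        "VALUES (1, 'example', 'Example Journals Online', 'CUFTS-00042', ?)",
        (datetime.now(timezone.utc).isoformat(),),
    )
    touch_resource(1, active=0)   # "delete" by deactivating; timestamp is updated
    print(conn.execute(
        "SELECT global_identifier, active, modified FROM resources").fetchone())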

New Things

  • A new title list loader that can load the resource change file and update title lists using the internal format, bypassing the title list parsing built into each resource module (see the sketch below)
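
A minimal sketch of such a loader, assuming the internal format is a tab-separated file whose header row already uses CUFTS field names; the file name and field names here are placeholders, since the internal format has not been defined yet:

    import csv

    def load_preparsed_title_list(path):
        """Read a pre-parsed title list, skipping the resource module's parser."""
        with open(path, newline="", encoding="utf-8") as handle:
            reader = csv.DictReader(handle, delimiter="\t")
            for row in reader:
                # A real loader would insert/update the title row here and
                # link it to the resource identified in the change file.
                yield {field: (value or "").strip() for field, value in row.items()}

    if __name__ == "__main__":
        for title in load_preparsed_title_list("example_journals_online.tsv"):
            print(title["title"], title.get("issn", ""))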

Questions/Notes

  • Are "journal auth" records part of the global data? In theory they should match exactly if built locally, however without exporting they wont pick up manual changes to the global data. They also will drift if local or "local global" title lists are loaded.
  • Could local sites and local resources be used to create subsets of the global resources for specific installs? This should work.
  • The main SFU front-end install should use the same process.