We are currently using Scheduler for a couple of different tasks. We have a few create lists for students with fines that get run every night. These lists then get exported to an FTP server, where one of our member academic libraries pulls the files out and updates the students' accounts with the financial office. We also have a process, used by two of our libraries, where patron loads happen automatically via Scheduler: CSV files get sent to the FTP server, and Scheduler reaches in, pulls the file out, and loads the records using our desired load table.
We use it to export MARC records from the catalog to send to our discovery layer vendor (EBSCO).
Twice a week, Scheduler runs a list of new bibliographic records only and exports them directly to EBSCO's FTP server.
Once a month, we replace our entire catalog in the discovery layer (to account for deleted records as well as added ones). Scheduler exports all non-suppressed bib records in the catalog as a series of large lists to a local FTP server, and a staff member combines them into one file and FTPs it to EBSCO. (We've found that FTPing the large full-catalog lists directly to EBSCO's FTP server tends to fail, due to the nature of EBSCO's process and the size of the files, but at least a staff member doesn't have to manually create and export the lists.)
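For anyone scripting the manual combine step: since MARC records are self-delimiting, the exports can simply be concatenated. This is just a sketch — the filenames below are stand-ins created for illustration (real Scheduler output names will differ), and the upload command is shown commented out with a placeholder host:

```shell
# Combine Scheduler's multi-part catalog exports into a single MARC file.
# The two "part" files here are placeholders standing in for real exports.
workdir=$(mktemp -d)
cd "$workdir"
printf 'part1' > catalog-part-1.marc   # stand-in for the first export
printf 'part2' > catalog-part-2.marc   # stand-in for the second export
cat catalog-part-1.marc catalog-part-2.marc > catalog-full.marc
ls -l catalog-full.marc
# The combined file would then be sent on manually, e.g.:
# curl -T catalog-full.marc ftp://ftp.vendor-example.com/ --user "$FTP_USER:$FTP_PASS"
```

Because MARC binary records each carry their own length in the leader, a plain `cat` of export files yields a valid combined file with no post-processing.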
We have used it for other things in the past (for example, exporting a weekly snapshot of books currently checked out to send to our consortium as part of a collection analysis study).
I use Scheduler to send changed bibs to our discovery system daily. We only reload the entire database if we make an index change in the discovery system. I have my Scheduler saved search set to gather only records with an update date of "yesterday," so I'm sending at most a few thousand records. We are only FTPing them locally, but I've never had a problem.
We just bought Scheduler, primarily for EDS record loads. We have been doing a complete reload every two weeks, so I was trying to replicate that process with Scheduler (I'd be tempted to ask for our money back if the process turned out to still require staff intervention!).
Previously we had to split the output into four or five different lists and output files. This wasn't due to the size of the lists (our largest is set to 1,875,000 records) but to the old limit on output file size in Data Exchange. A recent Sierra update (2.4, I think) increased this, so now we run it as two lists/output files, averaging around 1.4 and 2.3 GB each.
I was having a lot of trouble getting it to work with Scheduler. While it would work going to a local FTP site that I run, the files were not showing up on the EBSCO FTP site. Neither EBSCO nor iii support were any help, and it definitely doesn't help that the Scheduler email notifications would always say the process completed successfully when it hadn't. Does that happen to everyone else using Scheduler? It seems like I had seen that complaint before.
Anyway, I finally thought to try checking the "use secure FTP" box, and that seemed to do the trick! (This was strange, because the EBSCO site is not a secure FTP site.) So now I've got it set to run weekly and am keeping my fingers crossed that it continues to work. I'll still have to check it manually because I can't trust the email notifications. Maybe I can live with that, but I'm going to push for a fix, because it seems like a simple thing for them to solve.
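One way to reproduce that finding outside Scheduler is to probe the server with curl, once over plain FTP and once requiring TLS (`--ssl-reqd`, i.e. explicit FTPS). This is a sketch, not EBSCO's documented setup: the hostname is a placeholder, and the function only prints the commands unless you clear `DRY_RUN` and supply real credentials:

```shell
# Probe whether an FTP server accepts plain FTP, FTPS, or both.
# With DRY_RUN=1 (the default) the curl commands are printed, not run;
# set HOST/FTP_USER/FTP_PASS and DRY_RUN=0 to probe a real server.
DRY_RUN="${DRY_RUN:-1}"
HOST="${HOST:-ftp.example.com}"    # placeholder hostname

probe() {
    # $1 = extra curl flags: "" for plain FTP, "--ssl-reqd" for explicit FTPS
    if [ "$DRY_RUN" = 1 ]; then
        echo "curl --list-only $1 ftp://$HOST/ --user \$FTP_USER:\$FTP_PASS"
    else
        curl --list-only $1 "ftp://$HOST/" --user "$FTP_USER:$FTP_PASS"
    fi
}

probe ""            # plain FTP directory listing
probe --ssl-reqd    # require TLS before login (explicit FTPS)
```

If only the `--ssl-reqd` attempt succeeds, the server effectively requires secure FTP even when it isn't advertised that way — which would match the behavior described above.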
Joe, I am curious whether you can assist me with something (again!) - we're trying to set up Scheduler for our weekly upload to Summon, but of course Summon requires date strings in their filenames (I'm assuming that is what is going wrong with our upload... although it might be something else?). Are you having Scheduler handle the FTPing as well? If so, would you mind sharing the filename you give to Scheduler? E.g., in our task I have "vassar-catalog-updates.marc" - but (I think) I need it to also include the date string, a la sourceid-catalog-updates-`date +%F-%H-%M-%S`.marc. Weird question, but is it possible to get Scheduler to recognize the date variables and fill things in on the fly? ::crosses fingers::
For future reference: it appears that Summon is ingesting our uploads without the date appended to the filename (i.e., "vassar-catalog-updates.marc") - so far, so good! I'm going to keep an eye on things, but for about a week we've been uploading automatically (daily), and as far as I can tell I've been able to successfully find newly added items in our Summon instance. I'll update if this changes.
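In case a vendor ever does insist on the date string, a small wrapper script run before the upload could rename Scheduler's fixed-name export, since Scheduler itself doesn't appear to expand `date` variables in filenames. A minimal sketch, reusing the example filename from this thread (the `mv` is commented out because the source file only exists on the real server):

```shell
# Build a date-stamped filename for a fixed-name Scheduler export.
stamp=$(date +%F-%H-%M-%S)                 # e.g. 2017-10-31-02-15-00
outfile="vassar-catalog-updates-$stamp.marc"
echo "$outfile"
# mv vassar-catalog-updates.marc "$outfile"   # then FTP "$outfile" instead
```

The rename happens entirely outside Scheduler, so the scheduled task itself can keep its simple fixed filename.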
I use it for several weekly and monthly lists. I do think it's a time saver. When I come in on Monday, my data is ready. It doesn't take long to run a list manually,
but when you have 4+ lists that you pull every week, that time adds up. And I schedule the yearly reports to run off-hours so that I'm not waiting for them to complete. I use it for:
Marcive – we send them new bibs weekly
Novelist – we send updates weekly (I schedule this to run and export)
Pulling items marked withdrawn to suppress
Pulling a batch of withdrawn items to mark withdrawn globally
Lost and paid accounts to delete
Online reg accounts to cancel if they haven’t activated their card
Hi, we use Scheduler for most of our weekly and monthly housekeeping tasks. I believe it is a big time saver, as the data is refreshed while the libraries are closed and can be exported when we arrive at work. (We would love to be able to schedule the export as well.)
Currently we use Scheduler for:
Portable RFID reader search lists - updated daily
Weekly and monthly lists, such as:
Items with "problem" statuses, such as Repair, Damaged, Returned Incomplete, etc.
New members signed up in the last week; libraries check the records to ensure details such as addresses and emails have been entered correctly
Interstate members due to expire (these members pay a fee to access our collection. We are located near an interstate border and we do have some people from over the border wishing to use our collections).
Items in transit for too long
Items claims returned
Users with manual blocks, or blocks due to long overdues, at one of our mobile library stops that has poor internet access, where staff often have to work offline
and lists of checked-out items from our special collections, such as book club kits and special needs equipment, so the coordinators of those areas can monitor them.
We have only been on Sierra for about 6 months, so I guess we will also be scheduling our annual clean-up lists, such as the removal of temporary and long-expired members, as well.
Hi Catie, is there a reason why you can't use an Output Delimited Records task to export your list data to a local FTP server that staff could grab their files from?
We'd love to be able to schedule the export of the lists, but at this stage we don't have access to a local FTP server. We have had a request in for a while to get access to the one owned by the council, but due to the outsourcing of IT support and a general freeze on non-critical IT change requests (we are hosting the Commonwealth Games in about 6 months), we have been warned not to hold our breath. Even if we do get access, it will be limited to only a few people in library admin.
Just an addition to my comments above.
We have our system set to look up call numbers for an item from the bib record and only include an item call number in the item record if it is different to the bib (eg a reference copy).
Since an upgrade last July, Scheduler is no longer able to look up call numbers (even if the login scheduling the task has the "look up call numbers" setting ticked). So scheduled item lists now do not include call numbers, which we have found to be a real pain.
The Mobile Worklists app, however, does show the call numbers for these lists if a list is imported onto the mobile device via the app.
Patron and bib lists still schedule without issue.
I would like to know how to use Scheduler for running daily lists to update Summon as well. I have been working with Sierra/Millennium for a while, but I am new to this aspect of it. Any assistance would be appreciated. Thank you.