Review of Technical Services Activities 2008-2009

1. Allocation of Staff Responsibilities:

a) Immersed all units in cross-training to work with bibliographic, check-in, order and item records. Integrated processes and trained staff on start-to-finish workflows whenever possible, instead of passing work back and forth from one staff member to the next. Every staff member had job specializations, but each was encouraged to gain an understanding of a broad spectrum of Technical Services operations.

i. Streamlined the workflow so that the same staff member could place orders, receive items, batch load or download records when needed, process invoices and catalog videos, firm orders and foreign language approvals.

ii. Serials invoicing staff verified and placed serials orders, set up bibliographic and order records, had the license signed and processed the invoice.

iii. Trained every Technical Services staff member to copy catalog when needed.

iv. Trained every Technical Services staff member to verify URLs when needed.

v. Established the “Train the trainer” pyramid model: anyone experienced in a particular activity could train another staff member, e.g. four trainers could be training four other staff at the same time on a one-on-one basis. This raised the number of staff who could train others, document procedures, and serve as ad hoc team leads, and it provided the training support for all staff assuming new responsibilities.

b) Moved senior library assistants to focus on complex tasks and to assume a greater leadership role in designing, revising, implementing, and training other staff on new or revised work procedures. Re-deployed staff continuously based on needs and workflow efficiency to free up experienced library assistants for tasks requiring their expertise. Rush and replacement orders were reassigned from an LA5 to an LA3.

          Training was offered to:

i. LA3 to copy catalog music upon the retirement of the LA5.

ii. LA3 to copy catalog CJK materials for LA5.

iii. LA3s to upgrade subject headings previously performed by three LA5s, using Millennium’s automatic authorities processing rather than manual global update.

iv. LA3s to upgrade old serials records for LA4.

v. LA3 to add missing analytic bibliographic or item records discovered through the Google Digitization Project.

vi. LA3s to place orders and load bibliographic records for orders.

vii. LA3 to share with LA4 in the monthly generation of collection title statistics.

viii. LA3 to share in the use of MarcEdit for batch record update.

ix. LA3 to copy catalog computer games.

x. LA4 and LA3 to copy catalog continuations.

xi. LA4 in Maps to copy catalog maps upon retirement of the LA5.

xii. LA4 to copy catalog videos, originally performed by LA5.

xiii. LA4 to copy catalog DVD, originally performed by LA5.

xiv. LA5 to original catalog Special Collections serials.

xv. Library school interns to catalog 220 Stanley Map Archives (4,000 maps and aerial photos).

xvi. Another library school intern to original catalog materials.

xvii. Merged serials and acquisitions student job description and experimented with students doing serials checkin.

xviii. All staff needing copy cataloging or link verification training.

c) Introduced work sharing among staff so that teams could be easily assembled to tackle bottlenecks and to provide fast turnaround time. Acquisitions staff could switch to copy cataloging when ordering was down, and Electronic Resources staff could work in Acquisitions when ordering was up. Each staff member was given a diversified range of responsibilities, and monopolies of work knowledge were kept to a minimum. This expanded the number of staff able to perform each core function.

          The number of staff who could:

i. Copy catalog monographs increased from 6 to 12;

ii. Copy catalog serials from 1 to 6;

iii. Copy catalog both monographs and serials from zero to 6 (a major breakthrough to cross the difficult boundary between monographs and serials);

iv. Check electronic links from 4 to 12.

v. Work in both Acquisitions and Cataloging from 4 to 11.

vi. Work in all three units from zero to 5.

d) Optimized tasks through automation and trained all staff in batch processing: Many routine batch processing tasks, using saved searches, templates, macros, constant data and global updates, were developed. Help from IT staff was requested when needed, e.g. installing plugins in all 64 iMacs in Info Commons, the Media Center and the Cowell Room instead of coding browser preferences in the Cruzcat OPAC for Mac users. The size of the OUTRECS file had to be expanded to 3 million records (from 200,000) and the transaction file to allow 2.5 million transactions.

         Most of the following were new services:

i. Daily verification of Science & Engineering locations against their call number range to ensure accurate location assignments.

ii. Weekly OCLC holdings adds and deletes to preserve synchronization between WorldCat and Cruzcat.

iii. Daily creation of serials file for use in database quality intervention.

iv. Weekly Local Holdings Records submission to OCLC.

v. Weekly item price updates.

vi. Weekly global update of local note tags to protect them from overwrites.

vii. Routine verification of itypes with status codes in item records.

viii. Verification of infrequently used or obsolete location codes.

ix. Routine Millennium automatic authority processing.

x. Routine global updates of closed dates in personal name authority headings.

xi. Promoted the use of the web-based Help-Desk Request Tracker ticketing system, which had proven to be a cost-effective approach for our management of electronic resources. It served as a permanent database of email records documenting acquisitions, serials/electronic resources and technical services transactions, and as a channel for communicating with staff, users, vendors, public services and CDL, providing multiple-staff access for work sharing and prompt response to any queries to the system.

xii. Used various technologies to provide relevant system generated reports by extracting data from Cruzcat and/or vendor management databases for staff use in projects or routines, securing significant time savings, often from weeks to a matter of hours, e.g. exported order numbers from Ebsconet and cross-matched with order record numbers from Cruzcat to update subscription formats.

xiii. Employed OCLC macros in different cataloging operations, e.g. to automatically create online records from print records instead of manual entry in OCLC, or to alert staff to record errors. This had been applied to creating records for the Google digitized items as well as for publications converting to digital format. This greatly reduced the operation to a couple of minutes per record.

xiv. Established procedures to move bibliographic records between Cruzcat and WorldCat in batches, using freeware like MarcEdit and Tera Term. This reduced record creation from months to a couple of days, depending on the number of records involved.

xv. Used Tera Term and AutoIt to batch create checkin records, optimizing our cataloging of electronic resources and simplifying our weekly LHR batch update process.
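The Ebsconet/Cruzcat cross-match in item xii above amounts to a join of two exported files on order number. The sketch below illustrates the idea with inline sample data; the file layouts, column names and format-code mapping are all assumptions for illustration, not the department's actual export formats.

```python
import csv
from io import StringIO

# Hypothetical sample exports -- real data would come from Ebsconet and a
# Cruzcat Create Lists export; column names here are assumptions.
ebsconet_csv = """order_number,format
o1000014,print
o1000022,online
o1000030,print+online
"""

cruzcat_csv = """record_number,order_number,format_code
.o1000014x,o1000014,p
.o1000022x,o1000022,p
.o1000030x,o1000030,b
"""

# Map vendor format labels to local order-record format codes
# (an illustrative coding scheme, not UCSC's actual one).
FORMAT_CODES = {"print": "p", "online": "e", "print+online": "b"}

def find_mismatches(ebsco_rows, cruzcat_rows):
    """Join the two exports on order number and flag order records whose
    local format code disagrees with the vendor's subscription format."""
    local = {r["order_number"]: r for r in cruzcat_rows}
    mismatches = []
    for row in ebsco_rows:
        match = local.get(row["order_number"])
        if match and match["format_code"] != FORMAT_CODES[row["format"]]:
            mismatches.append((match["record_number"], row["format"]))
    return mismatches

ebsco = list(csv.DictReader(StringIO(ebsconet_csv)))
cruzcat = list(csv.DictReader(StringIO(cruzcat_csv)))
print(find_mismatches(ebsco, cruzcat))  # [('.o1000022x', 'online')]
```

Because the join is keyed on a shared identifier, a list of hundreds of subscriptions can be reconciled in seconds instead of checking each record by hand.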

2. Partnering With Vendors:

a) Used Ebsco to determine tax status of serials for UCSC to save staff time in identifying, recording and reconciling tax payments one by one.

b) Set direction towards consolidation of primary monograph vendor and reduced the use of procard purchases for both firm and rush orders.

c) Increased contact with monograph vendors.

d) Explored vendor tools for electronic resources and optimal approaches to manage local and consortial purchases.

e) Explored Millennium ERMS for electronic resources management, looking for optimal ways to populate and maintain data, e.g. using the SFX KnowledgeBase, vendor licensing data and other campus strategies.

3. Collaborating with Collections Development:

a) Worked with the Head of Collections Development under the new library organizational structure to explore cost effective ways to collaborate on issues affecting both departments, e.g. serial order identity and format codes, serial payment start and end dates.

b) Created a replacement database to streamline replacement ordering decisions.

c) Compiled journal usage statistics data for Collections Development in record time to provide the needed information for the serials cancellation project.

4. Bibliographic Control of Collections: Our Reclamation Project radically increased UCSC's local holdings in WorldCat from an unacceptable 38%, created accurate record linking between Cruzcat and WorldCat, improved record match points for SCP records and enhanced many old records, especially serials, allowing us to drastically shorten the time for our weekly SCP loads. We took various measures to dramatically improve bibliographic control of the UCSC collections, by synchronizing holdings among multiple networks, enhancing access, linking and database quality, and greatly increasing efficiency in both backend management and in bibliographic production and maintenance.

List of activities and projects:

1. Synchronized Holdings Among WorldCat, Cruzcat, Melvyl, SFX & A&I Databases:

a) Completed OCLC Reclamation Project for 1.15 million locally cataloged records within 6 months (April-October, 2008) with an excellent match rate of 99.64%, by mobilizing the whole department and involving millions of record transactions.

i. Pre-Reclamation planning: Extensive record clean-up, serials record upgrades, and testing. We even had to involve Reserves staff at one point.

ii. Took advantage of the Reclamation to improve Cruzcat data quality, e.g. inserted missing title change tags (780/785) into serials records to identify them as potential problems; inserted missing ISSNs, ISBNs or government document numbers; loaded 050/090 into tag 950 to improve the generation of title collection statistics based on science and non-science ranges.

iii. Post-Reclamation clean-up: Technical Services, Special Collections, Map and Government Publications shared the work in fixing the majority of the 1,254 unresolved monograph records within one week. Five staff took one month to upgrade half of the 2,946 unresolved serials records while the serials cataloger stayed on to complete the rest of the project within a year.

iv. Ongoing record maintenance: Used global updates to insert special markers into the bibliographic records to identify which records need to be submitted for ongoing setting of OCLC holdings.

b) Completed the Reclamation and clean-up of 420,000 SCP records within a couple of weeks using a mass delete/mass add approach. We took the opportunity to improve the consistency of the record match points to facilitate automated overlay in the future.

c) Completed the Reclamation of vendor record sets from ebrary (40,000 records) and Oxford Scholarship Online (266 records).

d) Developed an efficient method to enter original records into OCLC for close to 200 Cruzcat records with no OCLC copy. They were batch downloaded from III for global edit via MarcEdit, then batch uploaded to OCLC to create totally new OCLC records and then batch downloaded again with all the corrections and the OCLC numbers to overlay the original Cruzcat records. This was a first step towards cataloging at the network level (to keep Cruzcat and WorldCat in sync). Manual entry would take weeks or even months, but this was accomplished within a couple of days.

e) Developed an efficient procedure to batch search and download records in the other direction, from OCLC to III.

f) Tested various WorldCat Local features, e.g. item status. Established weekly batch OCLC holdings updates for both adds and deletes.

g) Manually maintained coverage data and working links for subscriptions in Cruzcat and in the SFX KnowledgeBase, allowing users to access online from various sources, like Abstract & Indexing databases.

h) Synchronized Cruzcat bibliographic records, location codes and prompts with those in Melvyl and WorldCat, and cleaned up obsolete Cruzcat codes.

i) Established an optimal procedure to export Local Holdings Records (LHR) from Cruzcat to WorldCat by working with III using Millennium’s output tables and then asking OCLC to programmatically massage data in our LHRs, allowing us to eliminate the MarcEdit export/global update/re-import steps.
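The round-trip in item d) above ends with an overlay step: records come back from OCLC carrying their newly assigned OCLC numbers and must be matched to the Cruzcat originals they were derived from. A minimal sketch of that pairing step, assuming each record is represented as a dict of MARC-like fields (907 for the local bib number and 001 for the OCLC number are assumptions; the real records travel as MARC files through MarcEdit):

```python
def pair_for_overlay(uploaded, returned):
    """Match records returned from OCLC (now carrying OCLC numbers in 001)
    back to the Cruzcat bib numbers they were derived from, so the full
    records can overlay the originals."""
    # The local bib number rode along in a local field (907 here).
    by_local = {rec["907"]: rec for rec in returned}
    pairs = {}
    for rec in uploaded:
        match = by_local.get(rec["907"])
        if match:
            pairs[rec["907"]] = match["001"]  # bib number -> OCLC number
    return pairs

# Tiny illustrative records, not real data.
uploaded = [{"907": ".b1234567", "245": "Some local title"}]
returned = [{"907": ".b1234567", "001": "ocn555000111",
             "245": "Some local title"}]
print(pair_for_overlay(uploaded, returned))  # {'.b1234567': 'ocn555000111'}
```

Carrying the local number through the upload is what makes the batch overlay safe: the match is on an exact identifier rather than on title strings.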

2. Enhanced Bibliographic Access Points to Facilitate Information Retrieval

a) Reviewed the importance of proper indexing in batch record loading, automatic overlay, and machine linking among several systems, and requested that III enhance its indexing practices.

b) Retrospectively batch loaded title hooks (e.g. NetLibrary Legal Collection, Accessible Archives, Poiesis Philosophy online serials) and ensured that we had the proper records for each collection. This might involve cross-matching thousands of records to remove inactive records.

c) Downloaded names of serials publishers from vendor sites and cross-matched them with exported data from Cruzcat in FileMaker Pro and then reloaded them into the Cruzcat bibliographic records for staff use. Cross-training allowed staff to deal with data from both the bibliographic and order records.

3. Ensured Correct Linking to Full Text for Convenient User Access

a) Explored ways to identify broken links using various free link checkers on the web (Marcxgen, Xenu), in MARCEdit and in Millennium.

b) Worked with Government Publications staff to fix SCP California document URLs in both Cruzcat and OCLC and allowed SCP to re-distribute the good records back to other campuses.

c) Ensured working links were set up promptly for e-resources purchases and explored ways to track such purchases.

d) Employed Boolean search procedure to identify SCP records for non-subscribed CDL Tier II titles in Cruzcat, instead of searching each from a title list which was difficult to maintain.

4. Cleaned Up Erroneous or Incomplete Data For Database Quality:

a) Cleaned up bibliographic records for guides accompanying microforms for McHenry Reference.

b) Corrected missing paging in existing bibliographic records by using data from other libraries in WorldCat local.

c) Worked with Access to clean up on-the-fly records. Worked with Special Collections, Government Publications and Maps to clean up suppressed records and records with duplicate OCLC numbers. Cleaned up records in various ebook vendor record sets, e.g. missing, obsolete or duplicate records from Oxford Scholarship Online, ebrary and NetLibrary; wrong matches between ebrary and NetLibrary records.

d) Used Create List / global update to identify and fix records with problems, e.g. wrong or missing locations and itypes, missing 007 and 920/850 fields, improper 035 fields, unintentional duplicates, incorrect order record coding, unwanted item records, and wrong barcode formats.

e) Dealt with diacritics and missing record problems resulting from the problematic OCLC hardware upgrade in late 2009.

5. Increased Efficiency of Bibliographic Production and Maintenance.

a) Expanded our sources of records from OCLC and SCP to vendors and publishers, like ebrary, NetLibrary and Gale. Customized each loader to handle various problems like inconsistent coding and invalid match points.

b) Revised record loaders for loading records from OCLC, Gobi, Blackwell, etc. Changed the match point from OCLC number to ISBN for certain record sets.

c) Revised SCP loaders to reduce weekly load time from 3-4 hours to 1/2 hour.

d) Explored the use of brief vendor records in OCLC for generating orders. Reviewed whether on order/in process records should appear in WorldCat Local.

e) Expanded YBP PromptCat record profile to accept more records for use by acquisitions and cataloging (from 90% to very close to 100% of the purchases).

f) Performed regular batch update of Marcive holdings instead of subscribing to GPO Ongoing DB Utility Holdings Records Service and saved the monthly subscription fee.

g) With the optimal batch process in place, we completed batch OCLC searching and loading of 2,300+ DRAM (Database of Recorded American Music) records within a couple of days instead of months and established a procedure for on-going updates.

h) The FTE hours saved enabled us to assign one dedicated LA3 to copy catalog Special Collections items. Streamlined the process with Special Collections to provide clear guidelines for the LA3, speeding up bibliographic production.

i) Assigning the same LA3 to receive and catalog the foreign language approval books ensured that these items did not lag behind. They now moved as fast as the English language items in the cage.

j) Used WorldCat Local to look for call number data from other libraries’ records to complete new cataloging.

k) Created temporary brief Cruzcat record for every title with no OCLC copies and flagged them for easy retrieval. Performed monthly routine searches in OCLC to look for copies.

l) Explored batch overlay of Blackwell brief order bibliographic records with full OCLC records, using OCLC batch searching/batch download strategies.

m) Implemented Shelf Ready processing for CJK firm orders.

n) Established procedures to outsource original cataloging to OCLC.

o) Explored Shelf Ready for firm orders with YBP.
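Changing a load-table match point from OCLC number to ISBN (item b above) raises a normalization problem: the same book can arrive as an ISBN-10 in one record and an ISBN-13 in another, so both forms have to reduce to a single key before matching. A sketch of the standard conversion, with hyphens stripped:

```python
def isbn13(isbn):
    """Normalize an ISBN-10 or ISBN-13 string to a bare ISBN-13 match key."""
    digits = isbn.replace("-", "").replace(" ", "").upper()
    if len(digits) == 13:
        return digits
    if len(digits) != 10:
        raise ValueError(f"not an ISBN: {isbn!r}")
    core = "978" + digits[:9]          # drop the ISBN-10 check digit
    # ISBN-13 check digit: digits weighted 1,3,1,3,... summed mod 10.
    total = sum((1 if i % 2 == 0 else 3) * int(d)
                for i, d in enumerate(core))
    check = (10 - total % 10) % 10
    return core + str(check)

print(isbn13("0-306-40615-2"))   # 9780306406157
print(isbn13("9780306406157"))   # 9780306406157
```

With both records keyed this way, an incoming vendor record overlays the right brief record even when the two sources recorded the ISBN in different forms.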

6. Improved Coding in Records for Cost-Effective Backend Management: Preserved the integrity of the fixed field codes by creating or updating code values, display prompts and definitions. These codes had to be properly employed by all departments as well as meaningfully displayed and accurately linked to enable efficient global updates and record extraction, e.g.

i. Bibliographic CAT DATE vs CREATE DATE.

ii. Item itypes for Library Use Only materials, item status.

iii. Checkin codes for tracking cancellations and electronic access.

iv. Serials order codes for specifying subscription formats.

v. Invoice importing codes.
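The payoff of maintaining code definitions like those above is that consistency can then be audited mechanically, as in the routine itype/status verification mentioned earlier. A small sketch of that kind of audit; the code table below is illustrative, not UCSC's actual scheme:

```python
# Allowed status codes per itype (hypothetical values for illustration).
ALLOWED_STATUS = {
    "book": {"-", "m", "o"},      # available, missing, library use only
    "lib_use_only": {"o"},        # must always carry the "o" status
}

def audit_items(items):
    """Return the record numbers whose itype/status pairing is invalid,
    so they can be fixed in batch with global update."""
    bad = []
    for item in items:
        allowed = ALLOWED_STATUS.get(item["itype"], set())
        if item["status"] not in allowed:
            bad.append(item["record"])
    return bad

# Tiny illustrative item records, not real data.
items = [
    {"record": ".i100", "itype": "book", "status": "-"},
    {"record": ".i200", "itype": "lib_use_only", "status": "-"},  # invalid
]
print(audit_items(items))  # ['.i200']
```

Run against a Create List export, a check like this turns code drift from an invisible problem into a short fix list.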

Technical Services