
Knowledge Bases: The heart of resource management

  • May 20, 2020 8:00 PM
    Message # 8983131
    Anonymous member (Administrator)

    Recorded Session

    Elizabeth Levkoff Derouchie, Beth Ashmore, and Eric Van Gorden

    Sched Link
    YouTube Link
    Slideshare Link

    This session will discuss the knowledge base metadata lifecycle, current and upcoming metadata standards, and the effect that knowledge bases have on discovery and e-resource management. The presenters will look at ways knowledge bases can be leveraged to create downstream tools for resource management and discovery. The session will also offer different perspectives on knowledge bases, including those of librarians and product managers, along with a discussion of NISO's KBART Automation Recommended Practice and what it could mean for knowledge bases in the future. Finally, the session will consider how leveraging knowledge bases can help librarians improve resource discovery within their own libraries and ultimately reduce the time spent on metadata workflows. Through this presentation, we also aim to improve communication between the library community and metadata providers and creators.
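
    To ground the lifecycle discussion: knowledge bases are largely fed by KBART title lists, which are tab-separated UTF-8 files with a standard header row. Below is a minimal, hypothetical Python sketch (the file name is invented; this is not code from the session) of loading such a list and summarizing its coverage, the kind of downstream tooling the abstract alludes to.

    ```python
    # Minimal sketch: read a KBART title list and summarize it.
    # KBART files are tab-separated, UTF-8, with one header row.
    import csv

    def load_kbart(path):
        """Yield one dict per title row in a KBART file."""
        with open(path, encoding="utf-8", newline="") as f:
            yield from csv.DictReader(f, delimiter="\t")

    def coverage_summary(rows):
        """Count rows by publication_type; flag titles missing both identifiers."""
        counts, missing_ids = {}, []
        for row in rows:
            pub_type = row.get("publication_type") or "unknown"
            counts[pub_type] = counts.get(pub_type, 0) + 1
            if not row.get("print_identifier") and not row.get("online_identifier"):
                missing_ids.append(row.get("publication_title"))
        return counts, missing_ids

    # "provider_package.txt" is a stand-in for any provider's KBART file.
    counts, missing = coverage_summary(load_kbart("provider_package.txt"))
    print(counts)        # e.g. {'serial': 1200, 'monograph': 350}
    print(missing[:10])  # titles a KB manager would want to raise with the provider
    ```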

    Last modified: June 08, 2020 12:00 PM | Anonymous member (Administrator)
  • June 10, 2020 5:10 PM
    Reply # 9029016 on 8983131
    Susan Davis

    Hi, I am just partway through the presentation and want to thank you for one of the best representations I've seen of a knowledge base. I also love the expanded e-resources life cycle in relation to the KB slide. However, I mostly wanted to respond to the slide asking us for feedback on our views of KBs. Overall I think "all of the above," but between the incredible number of standard packages, customized packages, different license terms (access to content starts with the year you first subscribed, or a fixed point in time, or the dreaded rolling 20-year backfile), not to mention format inconsistencies (is a series a serial, a monograph, or both?), the data to be managed is enveloped in enormous complexity. Timeliness is a pet peeve too. A title that is "new" to publisher J's catalog as of January may not show up in the link resolver's KB for 3 months. Batch processes may be efficient, but they are not up to date. The new KBART Automation service, where real-time holdings updates are ingested into your link resolver, is a huge advance. I hope more publishers will be able to implement this service. Now back to the presentation.
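
    To picture what that real-time ingest could look like, here is a rough sketch of a link resolver pulling an institution-specific KBART holdings file from a publisher endpoint on a schedule. The URL, token, and query parameter are all hypothetical; actual endpoints and authentication vary by provider under the KBART Automation Recommended Practice.

    ```python
    # Hypothetical sketch of an automated holdings pull; not a real endpoint.
    import csv, io, urllib.request

    ENDPOINT = "https://publisher.example.org/kbart"  # hypothetical URL
    TOKEN = "institutional-api-token"                 # hypothetical credential

    def fetch_holdings(institution_id):
        """Pull this institution's current KBART holdings report."""
        req = urllib.request.Request(
            f"{ENDPOINT}?institution_id={institution_id}",
            headers={"Authorization": f"Bearer {TOKEN}"},
        )
        with urllib.request.urlopen(req) as resp:
            text = resp.read().decode("utf-8")
        # Key on title_id so successive snapshots can be diffed cheaply.
        return {row["title_id"]: row
                for row in csv.DictReader(io.StringIO(text), delimiter="\t")}

    # Run daily, diff against yesterday's snapshot, and push only the added
    # or removed titles into the link resolver instead of waiting on a
    # monthly batch load.
    ```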

  • June 11, 2020 11:33 AM
    Reply # 9030841 on 8983131
    Matthew Goddard

    This was an important and timely session, thanks so much.

    One of the problems you touched on that I'd really like to see tackled -- or at least explored -- is the lack of an upstream flow for publisher metadata. What I mean is that all of our standard discovery metadata processes are designed to flow downstream from the publisher to the KB provider to the library to the end user. Of those four entities, the latter two own by far the most eyeballs but have the least direct input into the metadata. Metadata corrections are a huge, inefficient time sink; they take ages to be implemented, and it's extremely easy to conclude that in many cases they're not worth the time.

    Any time you have metadata flowing in two directions, things have the potential to get messy pretty quickly, so of course extreme caution is warranted. But even just streamlined reporting processes that can be kicked up to the metadata owner would be a huge improvement.

    What I expect though is that our publishers simply aren't able to allocate a lot of resources to addressing these problems; in the meantime, there are thousands of libraries eager to help.

    I wonder if "override"-type fields might be part of the solution. What if every field had an override option where a library could enter its own local fixes, even at the article level? Taking it a step further, what if triggers were configured so that when the same string was entered into the same override field by X number of institutions across all of a KB's tenants, it got automatically kicked up the chain to the upstream source as a high-priority permanent fix?
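
    To make the trigger concrete, here is a minimal sketch of that escalation logic. Everything in it is hypothetical; it is not any vendor's actual API.

    ```python
    # Hypothetical sketch of crowd-sourced override escalation.
    from collections import defaultdict

    ESCALATION_THRESHOLD = 5  # the "X number of institutions"

    # (record_id, field, proposed_value) -> institutions entering that override
    overrides = defaultdict(set)
    escalated = set()

    def record_override(institution, record_id, field, value):
        """Store a local override; escalate once enough tenants agree."""
        key = (record_id, field, value.strip())
        overrides[key].add(institution)
        if len(overrides[key]) >= ESCALATION_THRESHOLD and key not in escalated:
            escalated.add(key)
            # A real KB would open a high-priority ticket with the upstream
            # metadata owner here; this sketch just reports it.
            print(f"Escalate upstream: {key} "
                  f"(entered identically by {len(overrides[key])} libraries)")

    # e.g. five libraries fixing the same wrong ISSN on the same record:
    for lib in ["A", "B", "C", "D", "E"]:
        record_override(lib, "rec123", "print_identifier", "1234-5679")
    ```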

    Thanks again for a thought-provoking presentation!

  • June 11, 2020 1:40 PM
    Reply # 9031142 on 8983131
    Beth Ashmore

    Susan,

    You make a great point about the lag between when a change occurs and when it shows up in a KB. There is another pre-recorded presentation by Greg Yorba, Keri Prelitz and Ilda Cardenas (https://youtu.be/J7bTQmFOqGY), and towards the end Greg describes the process he goes through of creating custom entries in his knowledgebase while he waits for the community knowledgebase to get updated. That just doesn't seem like a good use of any librarian's/staff member's time, but it is necessary if you want to get that access out there. I totally agree that there is an opportunity for KBART Automation to send those customized knowledgebase updates as changes are made to a library's subscribed content. Reducing that lag is an important thing for us to advocate for.

    Thanks for sharing,

    Beth

  • June 11, 2020 1:47 PM
    Reply # 9031148 on 8983131
    Beth Ashmore

    Matthew,

    That is a really neat idea for crowd-sourcing knowledgebase correction. I know I am guilty of making changes to my own knowledgebase that I know to be accurate, but not reporting them back to the vendor because I just couldn't be bothered. With your idea, I would know that my change would get flagged, and if enough other people agreed with that change, we would see it populated globally. I only have experience with the EBSCO and 360 KB knowledgebases, but I think something like this might be happening with Alma's knowledgebase; I am not sure if it is as streamlined as what you suggest.

    I will let Eric and Elizabeth talk about how this would look to a KB vendor, but you've got my vote!

    -Beth

  • June 11, 2020 3:31 PM
    Reply # 9031374 on 8983131
    Elizabeth Levkoff Derouchie

    I really miss having back and forth discussions. There are some really interesting points here.

    Susan – thank you for your feedback! I’m really glad that you’ve found the presentation to be helpful.

    You make a good point about timeliness. Sometimes there is an information lag between when a title is published and when the metadata is created, and I don’t know how much KBART Automation will fix that in practice. In the current environment, that comes down to communication.

    Which brings me to Matthew’s point. We do need to explore the upstream flow for publisher metadata more. Communication is already occurring in many cases between the KB and the publisher. As an example, I would point to the presentation from Tuesday, Walk This Way, where the Wiley representative discussed how migration information was communicated to various KB and discovery partners.

    But, I agree, librarians could be more active participants. Personally, I have taken the stance that if I find a problem in metadata, I try to report it. These problems are teaching moments where we can explain to the vendor or publisher what is wrong and how they can fix it. Ultimately, if the vendor doesn’t know something is wrong, they can’t fix it, and they can’t plan to make it better.

    You have an interesting idea. To me it comes down to the difference between customization and fixing a problem. If multiple libraries could benefit from a change in the metadata, that change should be considered. If there is an actual problem and something is very wrong with the metadata, it shouldn't take multiple libraries reporting it for the publisher to want to fix it.


  • June 11, 2020 6:00 PM
    Reply # 9031620 on 8983131
    Steve Shadle

    Good overview. I believe the more library operations and functionality we tie directly to the KB (with the ILS-centered KB being one approach), the less practical or beneficial a KB Automation approach is. Although I miss the knowledge and service that Serials Solutions provides, a big benefit of migrating to Alma was no longer having a separate library-managed workflow for MARC record loads. Activation = Discovery! The last thing I personally want is to have to manage a separate, publisher-specific workflow.

    I also understand the frustration the KB vendor suffers from having to serve as the mediator between publisher and librarian (which is one of the reasons ExLibris has gone the route of the Provider Zone). I have had occasional conversations in which publishers noted that library customers are not consistent in their requests and have at times asked for completely different things. Corrections to errors that very clearly violate established standards or recommended practices (e.g., ISSN) are one thing; modifications that improve one library's workflow or service at the expense of another's (e.g., "local" changes to Alma Community Zone records) leave publishers in a position of not really knowing what they should be doing. "Local" modifications should be managed locally.
