Transparency

Transparency is crucial for informed decision-making, and EBSCO is committed to providing clear labeling and explainable AI features.



Our Approach to AI Transparency

Transparency can be characterized as:

  • Interpretable — able to explain, in understandable terms, how something was created, selected, improved, maintained, or discontinued
  • Available — able to access the information needed to make decisions about it
  • Comprehensive — able to explain something from beginning to end, from different facets, and at various levels of depth
  • Accountable — able to follow, trace, verify, or review processes, actions, use, and decisions on a regular and repeatable cadence

Having explainable AI means that the metrics used to assess the AI, and the way the AI is used, designed, and maintained, are also transparently presented to end-users. Explainable and transparent AI allows users to determine whether an AI feature can be used and whether it will meet their needs.

Trust Through Transparency

Our updated terms of use for EBSCO products indicate that, because of intellectual property and copyright restrictions, publisher data cannot be used via EBSCO APIs for library and university AI.

Copyright, Access and AI

Used responsibly, Artificial Intelligence (AI) can help do amazing things, but be sure you understand the implications first. Learn more about how copyright and AI interact in this video.

Stay Informed

Contact us to learn more about AI at EBSCO, sign up for our AI beta programs, or collaborate with us on research and development initiatives.