“Library Integration into Institutional Learning Analytics”
IMLS, National Leadership Grants for Libraries, 2017
LIILA is a one-year National Forum grant designed to increase academic library involvement in institutional learning analytics and develop a detailed plan to prepare academic libraries to engage in this emerging and important use of data to support student learning and success.
Because higher education exists to educate students, academic librarians have engaged in learning assessment efforts for many years. Now, as institutions of higher education commence and commit to learning analytics initiatives, librarians need to explore and embrace emergent institutional learning analytics tools, systems, and strategies as well. Learning analytics “is the measurement, collection, analysis, and reporting of data about learners and their contexts, for the purposes of understanding and optimizing learning and the environments in which it occurs” (Society for Learning Analytics Research). Essentially, learning analytics employ data to improve learning contexts and help learners succeed. Learning analytics help educators discover, diagnose, and predict challenges to learning and learner success and point the way to successful and active interventions to benefit students. The LIILA project will spearhead the creation of the vision, strategies, and concrete plans required to ensure that librarians initiate involvement in institutional learning analytics and continue to serve as anchors of higher education communities focused on ensuring student learning and success.
LIILA seeks to achieve four goals:
1) increase librarian awareness of and engagement in learning analytics;
2) craft a detailed plan for integrating academic libraries into learning analytics initiatives that support student learning and success;
3) develop sustainable learning analytics partnerships and collaborations among academic librarians, educational technology linchpins, institutional and library IT professionals, and library vendor communities; and,
4) explore, design, and develop library use cases and data profiles based on learning analytics standards that can be used to integrate library data with institutional data stores.
Project activities include:
1) A literature and environmental scan will increase understanding of the role of academic library data in institutional higher education learning analytics initiatives.
2) A National Forum will be convened over three meetings.
3) Findings and conclusions from the meetings will be disseminated to the academic library and higher education community via rapid informal means, a formal white paper, and conference presentation proposals; feedback on each will be solicited.
The LIILA project coalesces academic library and higher education leaders and experts around common goals: articulating a vision for library inclusion in institutional learning analytics, devising strategies for bringing the vision to fruition, developing use cases that lead to increased library value and impact on student learning and success, and creating the technical plans necessary to initiate action. Through these actions, LIILA will:
- advance the role of libraries as anchors within their higher education communities,
- enable libraries to provide indispensable data to augment institutional understanding of student learning in higher education, and ultimately,
- facilitate student learning and success by contributing to the identification, development, and assessment of the curricular and instructional improvements resulting from learning analytics initiatives.
“Rubric Assessment of Information Literacy Skills (RAILS)”
IMLS, Laura Bush 21st Century Librarian Early Career Research Grant, 2010
RAILS is an IMLS-funded research project designed to investigate an analytic rubric approach to information literacy assessment in higher education, helping academic librarians and disciplinary faculty assess information literacy outcomes. Over three years, RAILS will yield a suite of rubrics that academic librarians and disciplinary faculty can use to assess information literacy outcomes; a transferable model for analyzing rubric scores; training materials for librarians, faculty, and LIS students who seek to use rubrics for information literacy assessment; indicators of rater expertise in rubric scoring; and a clearinghouse for librarians and faculty to share:
- local adaptations of IL rubrics,
- rubric assessment results,
- improvements to instructional strategies and services made on the basis of those results, and
- examples of increased student learning resulting from instructional improvements.
Although RAILS is intended to address practical assessment issues, it will also explore several research questions:
- Can librarians & disciplinary faculty use IL rubrics to provide valid & reliable scores of student learning?
- What skills/characteristics do librarians & faculty need to produce valid & reliable scores using IL rubrics?
- What training materials do librarians & faculty need to acquire these skills/characteristics?
- How can rubric assessment be used to improve IL instruction and services?
- How can rubric assessment increase student learning of IL skills?
RAILS is funded by the Institute of Museum and Library Services and operates in partnership with the ACRL Assessment Immersion Program, augmented by Waypoint Outcomes. More information is available at www.railsontrack.info.
Grant Participation Summary
- 2017 IMLS, Library Integration in Institutional Learning Analytics (LIILA), Syracuse University, Principal Investigator
- 2017 IMLS, Community College Library Support for Student Success, Northern Virginia Community College, Advisory Board
- 2017 IMLS, Library as Research Lab: Immersive Research Education and Engagement for LIS Students and Library Professionals, University of Michigan, Advisory Board
- 2013 IMLS, CRADLE: Curating Research Assets and Data using Lifecycle Education: Data Management Education Tools for Content Creators, Librarians, and Archivists, University of North Carolina at Chapel Hill, Advisory Board
- 2012 IMLS, Assessment in Action: Academic Libraries and Student Success, ACRL, APLU, and AIR, Advisory Board
- 2011 IMLS, Data Information Literacy (DIL), Purdue University, Advisory Board
- 2011 IMLS, Educating a New Generation of E-Scientists through Developing a Data Information Literacy Curriculum, Syracuse University, Advisory Board
- 2011 IMLS, Building Capacity for Demonstrating the Value of Academic Libraries, ACRL, APLU, CIC, and AIR, Program Designer & Facilitator
- 2009 IMLS, Rubric Assessment of Information Literacy Skills (RAILS), Syracuse University, Principal Investigator
- 2008 IMLS, Building an eScience Librarianship Curriculum for an eResearch Future, Syracuse University, Co-Principal Investigator
- 2007 NSF, CI-Facilitators: Information Architects across the STEM Disciplines, Syracuse University, Co-Principal Investigator
- 2007 IMLS, Building & Sharing Knowledge of Good Practice through the IMLS Clearinghouse, Syracuse University, Co-Principal Investigator (awarded, then frozen)
- 2006 NCSU, Program Assessment Grant, North Carolina State University, Principal Investigator