Academic Libraries and the Academy



Association of College and Research Libraries

A division of the American Library Association


The paper used in this publication meets the minimum requirements of American National Standard for Information Sciences–Permanence of Paper for Printed Library Materials, ANSI Z39.48-1992. ∞

Cataloging-in-Publication data is on file with the Library of Congress.

Copyright ©2018 by the Association of College and Research Libraries.

All rights reserved except those which may be granted by Sections 107 and 108 of the Copyright Revision Act of 1976.

Printed in the United States of America.

22 21 20 19 18    5 4 3 2 1


Foreword
Megan Oakleaf

Introduction and Context
Demonstrating Value through Library Assessment

SECTION 3: REACHABLE FRUIT

Chapter 20. Problems and Promises of Using LMS Learner Analytics for Assessment: Case Study of a First-Year English Program
Valerie Beech and Eric Kowalik

Chapter 21. Reframing Information Literacy Assessment: Reaching for College Seniors
Toni Carter and Dr. Megan Rodgers Good

Chapter 22. Library Instruction, Net Promoter Scores, and Nudging beyond Satisfaction
Richard “Ted” Chaffin

Chapter 23. Gathering User Behaviors: Improving Library Space while Enhancing the Library’s Profile
Margaret A. Fain and Jennifer H. Hughes

Chapter 24. Constructing the Evaluation Mosaic of a Library Module for New Undergraduate Students
Diana M. Finkle


Chapter 25. Breaking the SEAL: Enhancing Engagement with Academic Libraries and the Academy through Educational Design Innovation in Technology-Enhanced Learning
Mary Fleming, Paul Flynn, Tony Hall, Barry Houlihan, and Niall McSweeney

Chapter 26. Using Reflective Writing to Enhance the Student Research Process
Larissa Gordon

Chapter 27. Assessing the Effectiveness of Collaboration Workshops in an Academic Library: A Mixed-Methods Approach
April Hines, Bess de Farber, and Michael LeDuc

Chapter 28. Transitioning from a Teaching to a Research-Focused Collection in a Middle Eastern University: A Road Map for Future Directions
Lilly Hoi Sze Ho

Chapter 29. Creating a Strategic and Flexible Assessment Framework for Undergraduate Student Outreach
Amanda Hornby and Emilie Vrbancic

Chapter 30. Value Assessment Strategies and Return on Investment of the Twenty-First Century Libraries: Covenant University in View
Mercy A. Iroaganachi, Michael O. Fagbohun, and Nwanne M. Nwokeoma

Chapter 31. Cracking the Code: Course Syllabi Unpacked, Decoded, and Documented for Evidence of Library Value
Colleen Mullally, Jeremy Whitt, and Casey Ann Mitchell

Chapter 32. Building a Case for the Replacement of a Legacy Library Management System
Shameem Nilofar

Chapter 33. When Numbers Are Not Enough: Using Assessment toward Organizational Change
Nancy B. Turner

Chapter 34. Assessment as Engagement: Understanding Faculty Perceptions of Research at Trinity College
Erin Valentino, Rob Walsh, and Rachael Barlow

Chapter 35. Targeting Collection Assessment Data to the Intended Audience
Kimberly Westbrooks and Paula Barnett-Ellis


SECTION 4: HARD-TO-REACH FRUIT

Chapter 36. Story Time in the Academic Library: Using Assessment Evidence to Communicate Library Value
Amanda B. Albert

Chapter 37. “We Only See What We Look At”: Sight as a Metaphor for Exploring Student Library Use, Study Habits, and Research Behaviors
Valeda Dent, Kim Mullins, Eamon Tewell, and Natalia Tomlin

Chapter 38. Longitudinal Information Literacy Skills Assessment
Jessame E. Ferguson and Robin Dewey

Chapter 39. The Maturing of a Big Library Data Project: OR How to Future-Proof Your Library Data and Student Success Project
Jan Fransen, Kristen Mastel, Shane Nackerud, Kate Peterson, and Krista Soria

Chapter 40. A Voice of Their Own—Letting Library Collections Tell Their Story: The UNT Libraries Collection Map
Karen Harker and Janette Klein

Chapter 41. A Story for the Ages: Staff Engage in Reorganization by Reading a Decade’s Trend Data
Elena O’Malley

Chapter 42. Using Program Evaluation as a Proxy for Assessment: Diffusion from Policy Literature to Improve Academic Program Assessment


SECTION 1: SEEDING THE INITIATIVE

Chapter 1. High-Impact Practices and Archives
Kyle Ainsworth, Jonathan Helmke, and Linda Reynolds

Chapter 2. Growing Our Field Evidence: Succession Planning for Sustainable Information Literacy Assessment
Amanda L. Folk

Chapter 3. Connecting Student Success and Library Services
Diane Fulkerson and Jessica Szempruch

Chapter 4. Our “Special Obligation”: Library Assessment, Learning Analytics, and Intellectual Freedom
Sarah Hartman-Caverly

Chapter 5. Research and Writing in the Discipline: A Model for Faculty-Librarian Collaboration
Talia Nadir and Erika Scheurer

Chapter 6. Thinking LEAN: The Relevance of Gemba-Kaizen and Visual Assessment in Collection Management
Nazimah Ram Nath

Chapter 7. Delivering on the Institution’s Mission: Developing Measures for a Research Library’s Strategic Plan
Laura I. Spears, Trey Shelton, Chelsea Dinsmore, and Rachael Elrod

Chapter 8. Begin Again
Holt Zaugg

SECTION 2: LOW-HANGING FRUIT

Chapter 9. Three Thousand Library Users Can’t Be Wrong: Demonstrating Library Impact Using One Open-Ended Survey Question
Jackie Belanger, Maggie Faber, and Megan Oakleaf

Chapter 10. Rowan University Libraries’ Head-Counting Study
Susan Breakenridge

Chapter 11. Measuring Accessibility and Reliability of a Laptop-Lending Kiosk in an Academic Library
Hae Min Kim


Chapter 12. Triangulating an Assessment Plan
Starr Hoffman

Chapter 13. Leveraging Research to Guide Fundamental Changes in Learning: A Case Study at Kreitzberg Library, Norwich University
Richard M. Jones

Chapter 14. Answering the Question before It’s Asked: Building a Library Impact Dashboard
Jacalyn Kremer and Robert Hoyt

Chapter 15. Closing the Gap: The Library in Academic Program Review
Bridgit McCafferty and Dawn Harris

Chapter 16. An Ounce of Performance Is Worth Pounds of Promises: The Impact of Web-Scale Discovery on Full-Text Consumption
Anthony J. McMullen

Chapter 17. Show Them the (Data-Driven) Goods: A Transparent Collection Assessment Tool for Libraries
Caroline Muglia

Chapter 18. Q-methodology: A Versatile, Quick, and Adaptable Indirect Assessment Method
Eric Resnis and Aaron Shrimplin

Chapter 19. Assessing Discovery: How First-Year Students Use the Primo Discovery Tool
Karen Viars and Sofia Slutskaya

Author Bios


Foreword

Megan Oakleaf

Reflection is the hallmark of an effective practitioner. At the core of reflection is a spirit of introspection, a willingness to consider and question one’s own thoughts and actions. In professional roles, practitioners engage in reflection by considering the implications of their actions and leveraging a sense of doubt. Practitioners who allow themselves to doubt whether (or to what degree) their efforts lead to desired outcomes open a mental space through which awareness and learning may enter.

In educational spheres, assessment is a key tool for reflective practice. It is hard to overstate the importance of assessment; it is the lifeblood of teaching and learning. Without assessment, educators sever their relationships with learners, resulting in instructional efforts that succeed only by chance and may often fail to reach, support, or empower learners. In contrast, educational practitioners who conduct assessments 1) gain insights into the needs, goals, and values of their learners; 2) design learning experiences that meet students where they are, engage them in meaningful ways, and enable them to attain greater agency in their own lives; and 3) reflect and improve throughout each iterative teaching cycle, ultimately increasing the value of education for their present and future learners.

As active contributors to the educational mission of their institutions, academic librarians can expand student access to learning, ensure students are able to persist and attain their goals, and scaffold student experiences to aid attainment of independent learning capacity. They can support students as they develop productive self-awareness, metacognition, and self-actualization in a variety of contexts, including their immediate learning environments, the broader community, and the world around them. They can fulfill these educational roles; however, to ensure that they do, librarians must engage in reflection and assessment. Academic librarians who practice reflective assessment participate in “triple-loop” learning, thereby exploring whether they’re providing library services, resources, and spaces in the “right” ways, for the “right” reasons, and whether those “right” reasons align with professional convictions about information, education, and the role of libraries in higher education. The act of engaging in deep assessment as a reflective practice can be both revelatory and energizing for librarians, and the results of such assessments have a number of uses: a guide for daily library decision-making, a map for long-term library strategy decisions, and/or a communication tool for outreach to other members of the institutional community. For these reasons, all librarians should engage in reflective assessment; both the process and product of assessment enable librarians to articulate, own, and enact their role within academic libraries and the academy, writ large. Indeed, it is insufficient to deploy library services, resources, and spaces in hopes that they will contribute to student learning; rather, libraries must develop assessments to determine the degree to which their efforts contribute to student learning, use the results of their assessments to expand in areas that appear to make an impact on student learning, and re-imagine areas that do not. This is the most important purpose of assessment of academic library contributions to student learning: to improve and expand the ways in which libraries and librarians help students learn. Thus, the impact of the academic library on student learning is a vital component of library efforts to capture, convey, and communicate value. Librarians who seek to establish, grow, acknowledge, support, and reward the ways in which libraries support student learning often need to demonstrate the value of their existing efforts as part of an ongoing cycle to ensure that library services, resources, and spaces can continue and expand their contributions to student learning.

Likewise, librarians must demonstrate their value in other contexts. Academic libraries contribute to the success of their institutions in myriad areas: 1) faculty concerns such as teaching, research, grant seeking, and support for promotion and tenure; 2) institutional priorities including prestige or image, affordability, efficiency, accreditation, and preparation for changing student demographics; 3) community issues like development, inclusion, economic growth, and the education of an engaged citizenry; and 4) larger values including information literacy, critical thinking, and innovation. Reflective assessment of the library’s contribution to these institutional missions may include projects that assess library collections, space, systems, and personal connections, such as faculty-librarian collaborations. Related library assessment efforts may explore library user experiences, the role of libraries in institutional program review or accreditation, return-on-investment analysis, or more inward-looking assessments of organizational change or strategic planning processes.

To this end, this book, Academic Libraries and the Academy: Strategies and Approaches to Demonstrate Your Value, Impact, and Return on Investment, supports librarians in search of pathways to get started, gain traction, and galvanize existing efforts to convey the value of the library. Based on a case study approach, this resource collects and presents the lived experience of librarians across the globe as they seek to define, demonstrate, and articulate the value of their libraries. Presented in four sections, Academic Libraries and the Academy provides guidance for librarians at any stage of the assessment and value demonstration process. Helpfully, the authors have followed an established format in each chapter. Each case begins with the context of an assessment, describes the library’s communication of the assessment results and impact, continues with explanations of the ways librarians leveraged their findings, and closes with librarian reflections on each assessment project. The authors also provided useful information at the outset of each case, such as project foci, implicated data, selected methodology, timeframes, costs, and results. Structured presentation makes this text unique among library assessment publications and an invaluable tool for libraries and librarians committed to reflection and the pursuit of demonstrated value. Indeed, the range of projects, the deep treatment of each, and the organizational structures asserted by the authors combine to make this publication an assessment handbook of sorts, one that makes library assessment practice accessible to newcomers, provides sufficient detail to guide established practitioners, and offers a scan of the library assessment environment sure to educate and excite assessment researchers and students alike.


Introduction and Context
Demonstrating Value through Library Assessment

…to evaluate and assess the qualitative and economic value of their various library resources, including services, programs, collections, and facilities.

Figure I.1. Orr’s evaluation model (resources → capability → utilization → positive effects; input measures → process measures → output measures → outcomes; quality; value)

Library assessment in academic libraries typically involves a wide range of activities. Some examples are the required reporting of benchmarking data to various agencies, such as ARL (the Association of Research Libraries) as well as state and regional accrediting bodies; assessing library instruction and teaching; and determining the efficacy of library collections, programming, services, and facilities. Most of these activities focus on input/output measures or what many call “counting.” However, more recently there has been a renewed interest in assessment activities in academic libraries that can demonstrate impact, value, and return on investment. One of the primary drivers of this interest, particularly in the United States, has been a steady decrease in public funding coupled with a dramatic shift in funding formulas for public institutions of higher education.

Funding in Public Higher Education Institutions in the United States

In 2012, Thomas Mortenson, a senior scholar at the Pell Institute for the Study of Opportunity in Higher Education, projected that the average state fiscal support for higher education will effectively reach zero by 2059 (based on state appropriations trends for public higher education in the United States since 1980).2 Mortenson reported that state fiscal support in 2011 had decreased by an average of 40.2 percent compared to 1980, with some states reducing their funding for higher education by as much as 69.4 percent. Comparing the 2015–2016 academic year to the 2007–2008 academic year when the recession hit, state spending nationwide in the US was still down more than 18 percent on average, with nine states having cut funding by more than 30 percent, and two states over 50 percent.3 Over the last few decades, there has been a trend for increasing numbers of state public higher education institutions to transition from state-funded institutions (more than 50 percent of their operating budget funded by state appropriations) to state-assisted (less than 50 percent funded). This has been in sharp contrast to how other countries have invested in public higher education. In 2012, the Organization for Economic Co-operation and Development (OECD), a consortium of thirty-five member countries, found that across all thirty-five countries, 70 percent of the funding in higher education comes from public coffers, while in the United States, a mere 38 percent does.4
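Mortenson’s zero-by-2059 projection is easier to see with a rough back-of-the-envelope linear extrapolation of the averages quoted above (an illustration only; his actual projections were computed state by state):

\[
\text{average annual decline} \approx \frac{40.2\%}{2011 - 1980} \approx 1.3\% \text{ of the 1980 level per year,}
\]
\[
\text{years until zero} \approx \frac{100\% - 40.2\%}{1.3\%/\text{year}} \approx 46 \quad\Longrightarrow\quad 2011 + 46 \approx 2057,
\]

which lands within a couple of years of the 2059 figure.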

Historically, in the United States, state public institutions of higher education have been publicly funded by state governments based on full-time student enrollments. In 1995, the US was first among OECD member countries in college graduation rates.5 Unfortunately, over the past two decades, US college graduation rates have steadily declined.6 In 2014, the US ranked nineteenth in graduation rates out of the twenty-eight countries OECD studied. These and other low outcome measures, such as poor job placement rates and time to complete a degree, have disappointed legislators and policy-makers across the United States and prompted them to demand greater levels of transparency and accountability for state funding of public education. This has meant a shift toward outcome-based measures such as performance-based funding. Miao defines performance-based funding as “a system based on allocating a portion of a state’s higher education budget according to specific performance measures,”7 such as graduation rates, transfer rates, time to degree, and the number of low-income and minority graduates. As of January 2015, approximately thirty-five states had adopted a performance-based funding formula or were in the process of transitioning to one.8 The percentage of state funding allocated based on performance measures varies widely by state, ranging from less than 1 percent of base funding in Illinois to 100 percent of state funding in Tennessee (i.e., after a base amount is provided for operational support). Figure I.2 from the National Conference of State Legislatures shows the commitment by state in 2015 to performance-based funding for higher education. This trend toward performance- or outcome-based metrics to award public funding is also evident in Canada’s higher education environment.9
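Miao’s definition can be written schematically (a stylized formalization for illustration, not any particular state’s formula):

\[
F_i \;=\; B_i \;+\; \alpha \sum_{k} w_k\, m_{ik},
\]

where \(B_i\) is institution \(i\)’s base allocation, \(m_{ik}\) is its score on performance measure \(k\) (graduation rates, transfer rates, time to degree, and so on), \(w_k\) is the weight the state puts on that measure, and \(\alpha\) is the share of the budget tied to performance (under 1 percent in Illinois and, beyond the operational base, effectively 100 percent in Tennessee).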

Figure I.2. Performance-Based Funding for Higher Education by U.S. State

Decreased student enrollments, diminished budgets, and the fiscal reality of declining state appropriations are forcing administrators to more closely examine the allocation of funds and resources across the institution. The expression “doing more with less” has become an all too common mantra in higher education. In the past, most academic libraries benefitted from the assumption that, since people perceived them as a public good, there was little expectation for them to justify their existence or account for the spending of their budget. But now, with increased expectations of accountability and transparency for budget expenditures, institutions scrambling to “do more with less,” and the emergence of new budgeting models that view units as either cost centers or profit centers, academic libraries are under new pressures and scrutiny. Academic libraries, viewed in budgetary terms as cost centers because they bring in little to no direct revenue, are realizing the incredible importance and necessity of clearly articulating to their institutional administrators their contributions to institutional outcomes, their short-term and long-term value, and, in essence, their return on investment. This type of evidence-based advocacy tends to be new ground for many academic libraries in North America and around the world.

Library Organizations Supporting the “Value of Academic Libraries” Initiative

Fortunately, academic libraries are not alone in learning how to build knowledge and capacity to advocate for and provide evidence of their value and worth to institutional administrators. For more than a decade, a number of library nonprofit organizations have developed and launched robust resources and customizable materials to provide ongoing support and assistance to academic libraries in this effort. These include the Association of College and Research Libraries (ACRL), the largest division within the American Library Association (ALA), and the Association of Research Libraries (ARL).

Association of College and Research Libraries

ACRL, the largest division of the ALA with more than 10,500 members, is a professional association of academic librarians “dedicated to enhancing the ability of academic library and information professionals to serve the information needs of the higher education community and to improve learning, teaching, and research.”10 Every few years, the ACRL Research Planning and Review Committee authors a document on top trends in academic libraries. In 2012 and then again in 2014, the committee identified the value and contributions of libraries to student success and communicated this value as a top trend and priority for academic libraries.11

ACRL has played an instrumental role in advancing the conversation on the value of academic libraries. ACRL’s five-year Plan for Excellence, implemented in 2011, focused on three areas, the first of which was Value of Academic Libraries. The goal of this focus area was described as “Academic libraries demonstrate alignment with and impact on institutional outcomes.”12

In 2010, the ACRL Assessment Committee developed a Value of Academic Libraries Toolkit website to help academic librarians demonstrate the value of libraries to their users, institutions, and communities.13 The toolkit provided open access to related bibliographies; studies, grants, and reports; white papers and in-depth treatments; marketing tools and presentations; academic library ROI/value calculators; blogs; best practices; and assessment tools. The committee was subsequently eliminated when ACRL decided to make the Value of Academic Libraries movement a priority in its focus (J. Stein, personal communication, March 13, 2015). The website and toolkit, although still containing valuable research and documentation from 2010 and prior, have been superseded by an updated website: the ACRL Value of Academic Libraries website.14

The Assessment in Action: Academic Libraries and Student Success project, an IMLS-funded initiative, has been an important and systemic project that ACRL first launched in 2012. This initiative was designed to develop and sustain a professional development program for librarians to prepare them to lead collaborative campus efforts to assess and demonstrate the library’s impact on student learning and academic success on each of their campuses. Over the span of the three-year project (beginning in April 2014 and ending in June 2016), more than 200 institutions of all types participated in the project. Several publications have been released documenting the success of this project and highlighting a number of case studies.15 In 2017, ACRL launched a one-day workshop to build on the Assessment in Action curriculum and to focus on strategic and sustainable assessment. The Assessment in Action website offers more details about this workshop, as well as the history of this initiative, including a list of interim and final reports from various participating institutions.16

Association of Research Libraries

The ARL is a nonprofit organization of 125 research libraries at comprehensive research institutions in North America that share similar research missions, aspirations, and achievements.17 ARL has been actively involved in promoting and supporting various Value of Academic Libraries initiatives. The “Statistics & Assessment” section on ARL’s website states that its purpose is to “focus on articulating the value of research libraries by describing and measuring their performance and contributions to research, scholarship, and community service.”18 The section houses a number of ARL’s services and products directly addressing this topic, including the association’s ARL Statistics, ClimateQUAL, LibQUAL+, MINES for Libraries, StatsQUAL, and LibValue.

Some Success, but More Work to Be Done

Despite these concerted efforts by ACRL and ARL and documented success for many academic libraries, challenges still persist for others. In 2017, an ACRL research team reviewed 357 articles on library assessment and drew the conclusion that “librarians experience difficulty articulating their value to higher education administrators and other stakeholders… and use a small variety of methods, which may not match the methods relevant to senior leadership.”19 Furthermore, the team recommended that “librarians and library administrators must continue to develop best practices and effective documentation to demonstrate value and be willing to share these practices and documentation cross-institutionally.”20


Our set is a shining example of just this—best practices, lessons learned, approaches and strategies of how librarians, library professionals, and others in academic libraries around the world are successfully providing evidence of their contributions to student academic success and effectively demonstrating their library’s value and worth to institutional administrators and stakeholders. There is no “one-size-fits-all” approach to demonstrating a library’s worth and value, and accordingly our set shares a range of successful approaches and strategies utilized in different types of academic libraries around the world. Our set is also an opportunity to provide guidance and support to many of you—librarians, library professionals, and others involved in library assessment—who struggle to find the best approach and strategy at the right time in your assessment journey. Our set will help you build your knowledge in this area and teach you how to successfully articulate your academic library’s value by following and selectively applying the many replicable and practical strategies and approaches shared in the forty-two case studies contained herein.

The Uniqueness of Our Set

This is not the first book (or set of books) to be published on library assessment, nor will it be the last, but we wanted ours to have a unique place in this pool. We carefully sought unique characteristics or features that we believe set ours apart from others in this space. These distinctive features include our international representation; our selection of case studies illuminating thought-provoking, insightful, practical, and replicable approaches and strategies to library assessment; the accessible structure and convenient organization of each book; and our summary profiles. We believe these make this set unique, more accessible, pragmatic, and a must-have for anyone involved in assessment in academic libraries worldwide. You will ultimately be the judge in determining whether we have achieved this goal. We elaborate on each of these four features in the remainder of this introduction.

…we believe this set will and should have an international appeal.


Our Selection of Cases

We received an overwhelming response of interest from authors who submitted proposals for consideration for inclusion in this set. Our intake comprised a wide range of high-quality case studies employing a variety of methodologies and covering many aspects of librarianship, such as collections, services, facilities, information literacy, and program reviews. We carefully selected those cases that we thought were creative and innovative but also replicable and practical and would appeal to a host of individuals involved in different types of academic libraries across the globe. Selecting the final chapters to include in this set was no easy task, but we believe the assortment, innovation, and quality are the right combination. We believe you will agree!

Organization and Structure of This Set

How is this set organized and structured? We have organized the forty-two chapters of case studies into four distinct sections.

Each chapter begins with a chapter summary profile (which is described in the next section). For consistency, ease of reading, and comparative purposes, each chapter follows a similar structure that includes four foundational elements: context, communicating results and impact, leveraging the findings, and reflections. In addition, authors have added other appropriate content and elements depending on their particular topic.

Each of the four sections captures case studies that reflect a different stage of an academic library’s assessment journey in terms of time, resources, and expertise. Collectively, the titles of the four sections employ a metaphor indicative of these stages, namely Seeding the Initiative; Low-Hanging Fruit; Reachable Fruit; and Hard-to-Reach Fruit. Our intent was to make it easier for you to connect and relate to case studies depending on your particular point in your assessment path and the extent of your access to resources, funding, and expertise.

Volume One

• Section 1: Seeding the Initiative. The eight case studies in this first section explore the planning stages or “works-in-progress” in assessment that relate to the academic library’s impact and value. The results of these efforts may not be imminent. Nevertheless, these case studies demonstrate the potential value and the importance of the initial design and planning stage.

• Section 2: Low-Hanging Fruit. These eleven case studies offer stories of assessments that are easy to measure, short-term (less than one year), and low-cost; that require few resources (staff or tools); and that are easily replicable at similar academic libraries.

Volume Two

• Section 3: Reachable Fruit (with some effort). This section provides sixteen case studies that require more external and internal resources to measure, typically take more than six months to one year to collect and analyze, feature medium costs and resources (i.e., incentives, equipment, tools), and are replicable at other academic libraries that are similar in size or scope.

• Section 4: Hard-to-Reach Fruit. The seven case studies in this section include a range of assessment activities that are more difficult to measure and time- and resource-intensive, that require long-term data collection (e.g., longitudinal studies that require more than a year to collect a dataset or have measures that require more time, such as measuring a cohort’s graduation rates), and that feature greater external partnerships, internal infrastructure, or additional resources to measure and analyze.

Enjoy the read!

Marwin and Kirsten (Co-Editors)

Notes

1. R. H. Orr, “Measuring the Goodness of Library Services: A General Framework for Considering Quantitative Measures,” Journal of Documentation 29, no. 3 (1973): 318.

2. Thomas G. Mortenson, “State Funding: A Race to the Bottom,” Presidency 15, no. 1 (Winter 2012).

6. John Bound, Michael Lovenheim, and Sarah Turner, “Why Have College Completion Rates Declined? An Analysis of Changing Student Preparation and Collegiate Resources,” NBER Working Paper No. 15566, National Bureau of Economic Research, Cambridge, MA, December 2009, http://www.nber.org/papers/w15566; Mortenson, “State Funding”; D. Shapiro, A. Dundar, P. Wakhungu, X. Yuan, and A. Harrel, Completing College, Signature Report No. 8a (Herndon, VA: National Student Clearinghouse Research Center, February 2015).

7. Kysie Miao, “Performance-Based Funding of Higher Education: A Detailed Look at Best Practices in 6 States,” Center for American Progress, Washington, DC, August 2012, 1, https://cdn.americanprogress.org/wp-content/uploads/issues/2012/08/pdf/performance_funding.pdf.

8. National Conference of State Legislatures, “Performance-Based Funding for Higher Education,” National Conference of State Legislatures, Denver, CO, July 31, 2015, http://www.ncsl.org/research/education/performance-funding.aspx.

9. Mary B. Ziskin, Don Hossler, Karyn Rabourn, Osman Cekic, and Youngsik Hwang, Outcomes-Based Funding (Toronto: Higher Education Quality Council of Ontario, 2014), http://www.heqco.ca/SiteCollectionDocuments/Outcomes-Based%20Funding%20ENG.pdf.

10. “About ACRL,” Association of College and Research Libraries, accessed February 7, 2018, http://www.ala.org/acrl/aboutacrl.

11. ACRL Research Planning and Review Committee, “2012 Top Ten Trends in Academic Libraries: A Review of the Trends and Issues Affecting Academic Libraries in Higher Education,” College and Research Libraries News 73, no. 6 (June 1, 2012): 311–20; ACRL Research Planning and Review Committee, “2014 Top Trends in Academic Libraries: A Review of the Trends and Issues Affecting Academic Libraries in Higher Education,” College and Research Libraries News 75, no. 6 (June 1, 2014): 294–302.

12. Association of College and Research Libraries, ACRL Plan for Excellence (Chicago: Association of College and Research Libraries, 2011, revised 2017), 1, http://www.ala.org/acrl/aboutacrl/strategicplan/stratplan.

13. Association of College and Research Libraries, Value of Academic Libraries Toolkit (Chicago: Association of College and Research Libraries, October 2010), http://www.ala.org/acrl/issues/value/valueofacademiclibrariestoolkit.

14. Association of College and Research Libraries, ACRL Value of Academic Libraries website, accessed February 12, 2018, http://www.acrl.ala.org/value/.

15. See Kara J. Malenfant, Lisa Janicke Hinchliffe, and Debra Gilchrist, “Assessment as Action Research: Bridging Academic Scholarship and Everyday Practice,” College and Research Libraries 77, no. 2 (March 2016): 140–43, https://doi.org/10.5860/crl.77.2.140; Kara J. Malenfant and Karen Brown, Creating Sustainable Assessment through Collaboration, Occasional Paper No. 31 (Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment [NILOA], November 2017), http://learningoutcomesassessment.org/documents/Occasional_Paper31.pdf.

16. Association of College and Research Libraries, Assessment in Action: Academic Libraries and Student Success website, accessed February 12, 2018, http://www.ala.org/acrl/AiA.

17. “About,” Association of Research Libraries, accessed February 7, 2018, http://www.arl.org/about#.VQHAco5v8g4.

18. “Statistics & Assessment,” Association of Research Libraries, accessed February 7, 2018, http://www.arl.org/focus-areas/statistics-assessment#.VRBaBo7F98E.

19. ACRL Research Planning and Review Committee, Environmental Scan 2017 (Chicago: Association of College and Research Libraries, March 2017), 38, http://www.ala.org/acrl/sites/ala.org.acrl/files/content/publications/whitepapers/EnvironmentalScan2017.pdf.

20. ACRL Research Planning and Review Committee, Environmental Scan 2017, 39.

Bibliography

ACRL Research Planning and Review Committee. Environmental Scan 2017. Chicago: Association of College and Research Libraries, March 2017. http://www.ala.org/acrl/sites/ala.org.acrl/files/content/publications/whitepapers/EnvironmentalScan2017.pdf.

———. “2012 Top Ten Trends in Academic Libraries: A Review of the Trends and Issues Affecting Academic Libraries in Higher Education.” College and Research Libraries News 73, no. 6 (June 1, 2012): 311–20.

———. “2014 Top Trends in Academic Libraries: A Review of the Trends and Issues Affecting Academic Libraries in Higher Education.” College and Research Libraries News 75, no. 6 (June 1, 2014): 294–302.

Association of College and Research Libraries. “About ACRL.” Association of College and Research Libraries. Accessed February 7, 2018. http://www.ala.org/acrl/aboutacrl.

———. ACRL Value of Academic Libraries website. Accessed February 12, 2018. http://www.acrl.ala.org/value/.


———. ACRL Plan for Excellence. Chicago: Association of College and Research Libraries, 2011, revised 2017. http://www.ala.org/acrl/aboutacrl/strategicplan/stratplan.

Bound, John, Michael Lovenheim, and Sarah Turner. “Why Have College Completion Rates Declined? An Analysis of Changing Student Preparation and Collegiate Resources.” NBER Working Paper No. 15566. National Bureau of Economic Research, Cambridge, MA, December 2009. http://www.nber.org/papers/w15566.

Malenfant, Kara J., and Karen Brown. Creating Sustainable Assessment through Collaboration: A National Program Reveals Effective Practices. Occasional Paper No. 31. Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment (NILOA), November 2017. http://learningoutcomesassessment.org/documents/Occasional_Paper31.pdf.

Malenfant, Kara J., Lisa Janicke Hinchliffe, and Debra Gilchrist. “Assessment as Action Research: Bridging Academic Scholarship and Everyday Practice.” College and Research Libraries 77, no. 2 (March 2016): 140–43. https://doi.org/10.5860/crl.77.2.140.

Miao, Kysie. “Performance-Based Funding of Higher Education: A Detailed Look at Best Practices in 6 States.” Center for American Progress, Washington, DC, August 2012. https://cdn.americanprogress.org/wp-content/uploads/issues/2012/08/pdf/performance_funding.pdf.

Mitchell, Michael, Michael Leachman, and Kathleen Masterson. “Funding Down, Tuition Up: State Cuts to Higher Education Threaten Quality and Affordability at Public Colleges.” Center on Budget and Policy Priorities, Washington, DC, last updated August 15, 2016. https://www.cbpp.org/.

Shapiro, D., A. Dundar, P. Wakhungu, X. Yuan, and A. Harrel. Completing College: A State-Level View of Student Attainment Rates. Signature Report No. 8a. Herndon, VA: National Student Clearinghouse Research Center, February 2015.

Weston, Liz. “OECD: The US Has Fallen Behind Other Countries in College Completion.” Business Insider, September 9, 2014. http://www.businessinsider.com/r-us-falls-behind-in-college-competition-oecd-2014-9.

Ziskin, Mary B., Don Hossler, Karyn Rabourn, Osman Cekic, and Youngsik Hwang. Outcomes-Based Funding: Current Status, Promising Practices and Emerging Trends. Toronto: Higher Education Quality Council of Ontario, 2014. http://www.heqco.ca/SiteCollectionDocuments/Outcomes-Based%20Funding%20ENG.pdf.


This third section, Reachable Fruit (with some effort), as the section title suggests, provides sixteen case studies of doable projects in academic libraries that may require more external and internal resources to measure. The value of these studies is that they demonstrate the replicability of projects that take six months to one year to collect and analyze at academic libraries similar in size and scope to one’s own library. Projects typically feature medium costs and resources (i.e., incentives, equipment, tools).

This section comprises chapters covering a broad and diverse spectrum of practical topics on collection, qualitative, and instructional assessment. Chapter 33 (Turner) describes the relationship between assessment and organizational change. Examples of collection assessment include a project to develop collections for new academic programs (chapter 35, Westbrooks and Barnett-Ellis) and an electronic resources assessment for a university transitioning from a teaching to a research curriculum (chapter 28, Ho). A number of chapters present projects using qualitative research methods, such as chapter 34 on faculty perceptions of student research (Valentino, Walsh, and Barlow) and chapter 27 utilizing a mixed-methods approach to measure the effectiveness of collaboration workshops (Hines, de Farber, and LeDuc). Instructional assessment projects cover fresh topical ground, such as piloting the new Threshold Achievement Test for Information Literacy (TATIL) to measure student learning outcomes of seniors (chapter 21, Carter and Good) and applying nudge theory to conduct microassessments for library instruction (chapter 22, Chaffin). Authors with an international perspective (Ireland, Nigeria, Singapore, and the United Arab Emirates) are represented in this section as well.


Each chapter is prefaced by a one-page summary profile. The purpose of this summary profile is to give you a quick overview of the chapter by providing fourteen salient items of descriptive information for the case study. The summary profile page will help you decide if the subsequent case study is of interest to you and thus warrants a deeper investigation and thorough read.

We hope you enjoy the variety of sixteen academic library case studies in this third section, Reachable Fruit (with some effort), representing feasible assessments that require more external and internal resources to complete.


Section 3

Title: Problems and Promises of Using LMS Learner Analytics for Assessment: Case Study of a First-Year English Program

Abstract: Learning management systems (LMS) are widely used in education. They offer the potential for assessing student learning, but the reality of using them for this is problematic. This case study chronicles efforts by librarians at Marquette University to use LMS data to assess students’ information literacy knowledge in Marquette’s first-year English program.

Discussion covers:

• the development, implementation, and analysis of effectiveness of an online tutorial on students’ learning

• the difficulties involved in trying to use LMS data for performance assessment instead of surveys or quizzes

• the impact of inadequate LMS reporting tools on instructors’ willingness to use the tutorial

Keywords: assessment, learning management system (LMS), information literacy, embedded librarianship, learner analytics

Project focus: assessment methodologies, techniques, or practices; information literacy assessment; organizational practices (i.e., strategic planning); user behaviors and needs; data use and technology; assessment concepts and/or management

Results made or will make case for: proof of library impact and value, improvements to the tutorial

Data needed: LMS SCORM data from tutorial

Methodology: qualitative, quantitative, mixed method

Project duration: greater than 1 year

Tool(s) utilized: Articulate Storyline, LMS, Qualtrics, Excel, SPSS; instructional designer, statistics consultant; staff time for IRB informed consent, data collection, …


A core aim of academic libraries is to help students develop IL competencies so that they are equipped with the skills to be proficient swimmers, capable of more than just treading water in the expanding digital ocean of information. Although tests lack similarity to real-world situations,2 standardized tests tend to be the most frequently used data collection method to assess IL skills.3 Despite IL being considered a crucial skill for success in higher education and in life,4 there are relatively few instruments available to assess this set of skills; ETS recently discontinued its iSkills test.5 There are at least two multiple-choice tests available commercially that have been shown to provide a reliable and valid way of measuring IL.6 Other approaches used to assess IL skills include information search tasks, portfolios, analysis of term paper bibliographies, and use of integrated approaches based on several instruments.7 A final type of assessment approach is performance assessment,8 which requires students to do more than choose an answer from among several options. According to Leichner and colleagues, performance assessment is seen as a way to assess complex competences instead of factual knowledge, and it is complex competences that are at the core of information literacy skills.9 While performance assessment is a better way to assess complex competences, standardized tests are easier to administer and score. However, by leveraging embedded librarianship (the presence of librarians in an LMS) and the learner analytics accessible in an LMS, libraries have an opportunity to more easily implement performance assessment.

Mattingly, Rice, and Berge define learner analytics as a focus on how students access information, how they navigate through materials, how long it takes them to complete activities, and how they interact with the materials to transform the information into measurable learning.10 One way libraries can access data for learning analytics is through a campus LMS. Leeder and Lonn state that LMS adoption in higher education institutions has been rapid and widespread.11 The LMS also tracks a variety of data about the students, their progress, and their interactions in the online course. Several studies have explored the relationship between this data and students’ performance: for example, how accessing supplemental online resources benefits undergraduates; the link between LMS activity and student grades; and the significant relationship between time spent in the LMS and grades, especially for students who obtained grades between D and B.12 Other researchers have examined how analyzing student discussion post responses can yield understanding of student interaction patterns.13 However, Ifenthaler and Pirnay-Dummer argue that the use of an LMS is usually limited to only a few technological features and that utilizing more features of the LMS can promote meaningful learning environments.14 More recently, Alamuddin, Brown, and Kurzweil discuss the continuing difficulties in adopting learning analytics on a large scale, including the problem of pulling data from multiple platforms and finding ways to analyze it.15

In the library profession, embedded librarianship became more visible in the mid-2000s.16 According to Tumbleson and Burke, it allows librarians, through their presence in the LMS, to be as close as possible to where students are receiving their assignments and experiencing instruction.17 By having access to the LMS site of …

… in administration and management of the LMS, the difficulty of negotiating with faculty for permission and access to a course site, and lack of pre-existing librarian-specific roles in the LMS.18

The drawbacks of one-shot instruction sessions lamented by librarians since the 1960s continue to persist today.19 Tumbleson and Burke contend that utilizing an LMS can remedy some of these drawbacks.20 A number of case studies in the literature describe how librarians have utilized an LMS to address an instructional need.21 However, these case studies focus mainly on custom creation of research guides or links to subject-specific databases and do not offer research into the usage patterns of the tools by actual users except through webpage view counts. In their chapter on assessing the impact of embedded librarians in an LMS, Tumbleson and Burke add a caveat that even basic LMS usage statistics are generally limited to how many times a given resource has been opened and that the shallowness of this information does not give the librarian a good picture of student activity among the embedded resources in the LMS.22

One way to address this issue is to develop a tutorial, or suite of tutorials, that allows librarians to utilize the LMS to gather more detailed information about student performance toward achieving a certain IL skill or concept. Tutorials of this type can be developed using the sharable content object reference model (SCORM). A tutorial developed using SCORM can “talk” with any SCORM-compliant LMS, which includes most major LMSs. With the advent of rapid e-learning development tools such as Adobe Captivate and Articulate Storyline, it is now possible to develop highly interactive, SCORM-compliant modules without advanced programming knowledge.
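As a concrete sense of what this “talking” looks like, the sketch below models the kind of record a SCORM 1.2-style module reports back to the LMS at run time. The keys follow the SCORM 1.2 CMI data model (written via the runtime’s LMSSetValue calls), but the interaction IDs and values are invented for illustration and are not taken from the Marquette modules.

# Sketch of the run-time data a SCORM 1.2 module might report to an LMS.
# Keys follow the SCORM 1.2 CMI data model; values are invented examples.
scorm_submission = {
    "cmi.core.lesson_status": "completed",        # passed/completed/incomplete/...
    "cmi.interactions.0.id": "search_keyword_1",  # hypothetical interaction ID
    "cmi.interactions.0.type": "fill-in",
    "cmi.interactions.0.student_response": "social media",
    "cmi.interactions.1.id": "boolean_operator",
    "cmi.interactions.1.type": "choice",
    "cmi.interactions.1.student_response": "AND",
}

Because every compliant LMS stores this same data model, a module authored once in a tool like Articulate Storyline can report learner responses to D2L or any other major LMS without custom integration work.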

Context at Marquette University: The First-Year English Program

Since 1980, the Marquette Raynor Memorial Libraries have supported and participated in the Marquette English department’s first-year English (FYE) program. For many years, this meant that up to 75 percent of incoming first-year students came to the library for a one-shot fifty- or seventy-five-minute workshop that addressed a specific assignment in the first-semester freshman English course and introduced them to the Marquette Library resources and services. In fall 2013, this collaboration began a new phase when the library developed and incorporated online information literacy modules into the program. During the summer of 2013, a small team of librarians worked with the libraries’ instructional designer to develop a suite of SCORM modules. The team talked with FYE instructors and other librarians to determine competencies students would need and had struggled with in previous semesters. After determining the learning outcomes for the modules and reviewing the literature on developing effective e-learning, the team employed three concepts during module development: (1) segmenting of lessons, (2) use of conversational style, and (3) incorporating practice opportunities.

According to Clark and Mayer, it is important to break e-learning lessons into manageable parts and resist the temptation to develop a “kitchen sink” product.23 Following this guideline, the development team decided on discrete modules for each key IL topic instead of a larger, comprehensive tutorial that addressed IL skills as a whole. Having the IL concepts or tasks in discrete modules allowed them to be more easily integrated into instructor lesson plans. For example, some instructors may not feel that students need a lot of citation help but do need practice on narrowing a topic. Keeping content in separate modules also ensures students won’t be confused by unnecessary information. Research supports the idea that the tone and style of writing in an e-learning module impacts its effectiveness, and in particular, conversational style should be used over formal style.24 To ensure students better retain the information from an e-learning module, students need to apply their understanding of the concept to an actual example, which requires a deeper level of processing than a multiple-choice test question.25 However, they note that there is a paradox to practice: it must be deliberate practice (1) that focuses on a specific skills gap, (2) for which explanatory corrective feedback is given, and (3) that builds skills that transfer from the learning environment to the real environment.

The paper prototyping method espoused by Snyder was used to create initial versions of all modules,26 and production versions were created in Articulate Storyline and exported as SCORM packages. A website with demo versions of the packages, as well as instructions on how to install the modules and review student-submitted data, was developed as a support resource for faculty and librarians and can be viewed at http://mu.edu/library/lor/first-year-english/.

These modules were used in several ways in the FYE program: as student homework prior to the workshops (a flipped instruction model) or as in-class activities, and as study or review materials available within Marquette’s LMS. The libraries collaborated with Marquette’s Information Technology Services department, which administers the LMS, to have a special librarian role created in the LMS. Librarians are automatically enrolled in the LMS course sites they are working with. The librarian role provides the same level of functionality as an instructor role, that is, the ability to post materials and create discussion forums, surveys, and quizzes, with the exception that the librarian role does not have the ability to view or assign grades. For some librarians, enrollment in the LMS course simply offers an easy avenue for students to contact their class librarian. In other sections, and often dependent on the relationship between the instructor and librarian, the librarian facilitates online discussions, responds to student research drafts, or posts library-related content to their course page. To ensure a minimum standard of library familiarity for all students, there was a requirement that each section devote one class period to a “research day” with their librarian.

To maximize the class time librarians had with students during the research day, a flipped classroom model supported by interactive online modules was encouraged. Instructors from a variety of disciplines cite strengths of the flipped model, including efficient use of class time, more active learning opportunities for students, increased one-on-one interaction between student and teacher, student responsibility for learning, and addressing multiple learning styles.27 Several studies have shown that online interactive modules can be just as effective as in-person classes.28

After a successful pilot in fall 2013, in fall 2014 individual instructors and librarians negotiated which modules to load into the LMS course sites. In fall 2015, the entire suite of SCORM modules was loaded in draft mode into the LMS course sites of all seventy-seven sections of FYE by the libraries’ instructional designer, and then the instructor and librarian decided which modules to make visible. Although there were eight SCORM modules, FYE instructors were encouraged by their librarian and the FYE program faculty director to assign only the Introduction to Academic Research module to their students. Instructors were encouraged to ask students to complete the module prior to the in-class research day. With the module embedded directly into D2L (the LMS used at Marquette), both librarians and instructors could view student completion rates, as well as read the open-text responses to the practice search activity. This information allowed librarians the possibility of tailoring their instruction to the students’ demonstrated ability with IL concepts and skills; it gave instructors the possibility of awarding points for completion of the module. With many introductory elements of instruction presented and available for review, librarians had several options for how to direct class time. Some started by opening the discussion with questions raised by students’ experience with the module and then segued into more complex examples and sophisticated search strategies. Some allowed for peer-led instruction, having students demonstrate or describe for the class how they began their search, and others used the time for higher-level discussions of evaluating resources, including how to find information on a news publication to help determine credibility and bias.

In fall 2014, the authors collected data from the Introduction to Academic Research tutorial and analyzed it with the sole intent of determining which parts of the tutorial worked well and which did not. This led to revisions in the language used in the tutorial and in the layout of some screens. In fall 2015, after getting institutional review board approval, one instruction librarian coordinated with three other instruction librarians to visit their seventeen sections of FYE, to provide information about the research study, and to distribute consent forms to the students. Table 20.1 shows the broader context of the sample of 177 students who were recruited for this study, as well as how widely the tutorial was deployed and used.


Table 20.1. Student Participation

Students enrolled in 17 sections given informed consent presentation: 301 (22.1% of total)
Students enrolled in 17 sections who signed consent forms for sharing: 280 (20.6% of total)
Students in 17 sections who completed tutorial and shared their data: 177 (13.0% of total)

Of the likely 301 students who heard the informed consent presentation, approximately 93 percent (280) consented to the sharing of their data (it is not known if all were present on the day of the presentation), though in the end only 68.8 percent (207) of the 301 students completed the tutorial. The 177 students who both completed the tutorial and consented to share their data represent 13 percent of all students enrolled in the course.
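The percentages in Table 20.1 use total FYE enrollment, not the seventeen recruited sections, as their base; working backward from the reported figures (a consistency check, not numbers given in the chapter):

\[
\text{total FYE enrollment} \approx \frac{177}{0.130} \approx 1{,}360, \qquad
\frac{301}{1360} \approx 22.1\%, \qquad
\frac{280}{1360} \approx 20.6\%, \qquad
\frac{207}{301} \approx 68.8\%.
\]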

Upon completion of the library research day, the libraries’ instructional designer accessed the D2L course sites of the participating sections and pulled the SCORM data from student submissions for the Introduction to Academic Research module. D2L does not offer an “easy button” to export SCORM data as an Excel or even CSV file; SCORM data is shown only in an HTML table inside the SCORM Reports page of D2L. To get the SCORM data into a usable format, the instructional designer had to engage in data cleanup, entailing the use of the Firefox add-on “Copy as Plain Text” to get a clean copy of the data from D2L. The text was then pasted into Excel, where nonessential data, such as weighting, was removed and desired data, such as the student name and responses, was kept. To ensure student anonymity, an eight-character alphanumeric code was randomly generated to replace each student name. Once the Excel spreadsheet contained the responses from the 177 students, the Directory feature in Microsoft Word’s Mail Merge function was used to pull the responses from the Excel spreadsheet into a single document containing the listing of student responses, which made reviewing and coding the responses much easier.
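The same cleanup could also be scripted rather than done by hand in Excel and Word. The sketch below is a hypothetical reconstruction of those steps, not the authors’ actual workflow; the file names and column names ("student_name", "weighting", "response") are invented for illustration, since D2L’s real SCORM report layout differs.

import secrets
import string

import pandas as pd

# Hypothetical reconstruction of the manual cleanup described above.
# Assumes the copied D2L SCORM report has been saved as a CSV file.
raw = pd.read_csv("scorm_report.csv")

# 1. Drop nonessential columns such as weighting; keep names and responses.
clean = raw.drop(columns=["weighting"])

# 2. Replace each student name with a random eight-character
#    alphanumeric code to preserve anonymity.
alphabet = string.ascii_uppercase + string.digits
codes = {name: "".join(secrets.choice(alphabet) for _ in range(8))
         for name in clean["student_name"].unique()}
clean["student_name"] = clean["student_name"].map(codes)

# 3. Write all responses into one document for review and coding
#    (the role played by Word's Mail Merge Directory feature).
with open("responses_for_coding.txt", "w") as out:
    for _, row in clean.iterrows():
        out.write(f"--- {row['student_name']} ---\n{row['response']}\n\n")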

Analyzing the Tutorial Submissions

The Introduction to Academic Research tutorial comprises four parts:

1. a brief video explaining different source types by comparing them to the different types of maps one would use in different scenarios (finding a restaurant vs. finding elevation change),

2. a brief textual explanation of the differences between a library database and an internet search engine,

3. an interactive search activity in a library database, and

4. open-ended reflection prompts about the search experience.

The interactive search assignment was the part of the tutorial with performance assessment, and the data from that activity was the basis for this analysis. SCORM data collected from the LMS comprised the following elements submitted by students (a sketch of one such record follows the list):

• Student topic, or research question: short sentences or questions describing their topics. (Students were asked to work on a topic or question of their own choosing; the intent was to better engage their interest so that they would work on the tutorial seriously.)

• Search statement elements: student choices of two keywords for their topic and a Boolean command.

• Article identification elements: article title, journal title, date of publication.

• Reflection: student responses to three open-ended reflection prompts.
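To make the shape of a single submission concrete, it can be modeled roughly as follows. This is a sketch only; the field names are our own choices, not labels taken from the SCORM export.

from dataclasses import dataclass, field
from typing import List

@dataclass
class TutorialSubmission:
    """One student's interactive search assignment record.

    Field names are illustrative assumptions, not the SCORM schema.
    """
    student_code: str                 # random code replacing the student name
    research_question: str            # short sentence or question on the topic
    keywords: List[str] = field(default_factory=list)    # the two chosen keywords
    boolean_command: str = "AND"      # AND, OR, or NOT
    article_title: str = ""
    journal_title: str = ""
    publication_date: str = ""
    reflections: List[str] = field(default_factory=list)  # three prompt answers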

Utilizing guidelines in Saldaña’s Coding Manual for Qualitative Researchers, two codebooks were developed before reviewing the 177 submissions.29 One codebook focused on the more concrete data elements (the search statement and article information), and the other focused on the reflections. Initial development of the codebooks involved reading the responses through two to three times, making note of recurring phrases, problems, and topics, and then trying to organize them into broader themes or groups. Developing the codebook (see appendix 20.1) was an iterative process that necessitated several meetings between the researchers and sample coding of ten randomly pulled reflection submissions. Development of a codebook for the search statement and article information was fairly straightforward; in contrast, development of the reflection codebook was much more difficult. All reflections were coded twice because the results of the first coding, using the third iteration of the reflection codebook, convinced the authors that yet another revision was necessary. The fourth version of the reflection codebook was considerably compressed: sixty-seven codes were reduced to thirty-seven. Once the codebook was set, it was exported to a Qualtrics form to make capturing the coded data easier and to ensure the coded data could be exported in a format compatible with SPSS. For all items except the reflection, the codes were single answers (radio buttons); for the reflections, the codes were multiple answers (check boxes). After the data was coded, the authors worked with Marquette’s Assessment Director Sharron Ronco on the statistical analysis, which comprised occurrence or frequency tables for all codes and tables for coder divergence.
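As a rough illustration of the frequency-table step, assuming the coded data were exported to a CSV with one column per coded item (the file and column names here are hypothetical), the tabulation might look like this:

import pandas as pd

# Hypothetical export of the coded data: one row per submission,
# one column per coded item (e.g., a 1-3 rating of keyword quality)
coded = pd.read_csv("coded_submissions.csv")

# Occurrence counts for a single-answer (radio button) code
print(coded["keyword_quality"].value_counts().sort_index())

# Percentage version, mirroring the "% of total responses" tables below
print((coded["keyword_quality"].value_counts(normalize=True) * 100).round(2))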

Two coders coded each student response. Saldaña provides a summary review of the literature on rationales for coding collaboratively,30 which range from ensuring multiple viewpoints, interpretations, and potentially a better analysis, to simply sharing the labor. For this research project, coding collaboratively seemed a logical extension of building the codebook. One drawback of coding collaboratively without the aid of software to ensure inter-coder consistency (e.g., NVivo) is that the coding is not always consistent. One solution might have been to revise the codebook once more, with tighter definitions, and to recode the data yet again. Instead, the researchers employed a solution suggested by Marquette’s Assessment Director: on responses where codes diverged, a new n.5 code category, called a “split decision,” was created for the analysis. For example, a 2.5 code means that the response was coded as 2 by one coder and as 3 by the other. The justification for doing so is that the main focus of the analysis always was the students’ responses, not whether the two coders were in complete agreement.
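A minimal sketch of the split-decision rule might look like the following; the handling of codes that diverge by more than one step is our assumption, since the chapter does not address that case.

def merge_codes(coder_a: int, coder_b: int) -> float:
    """Merge two coders' ratings using the split-decision rule.

    Agreement keeps the shared code; divergence by one step yields the
    midpoint (e.g., 2 and 3 become 2.5). Wider divergence is raised for
    discussion rather than silently averaged.
    """
    if coder_a == coder_b:
        return float(coder_a)
    if abs(coder_a - coder_b) == 1:
        return min(coder_a, coder_b) + 0.5
    raise ValueError("codes diverge by more than one step")

print(merge_codes(2, 2))  # 2.0 (agreement)
print(merge_codes(2, 3))  # 2.5 (split decision)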

Tutorial Submissions: Major Findings

The general conclusions the authors draw from analyzing the tutorial submissions are summarized in table 20.2; more detailed discussion follows.

Table 20.2

Major Findings

Search statement construction

• 90% of students chose appropriate Boolean commands.

• 55% of students chose good keywords for their research topics.

• 35% of students chose poor keywords.

Publication title recognition

• 60–70% of students clearly recognized journal/magazine titles in database records.

• 21–30% of students had difficulty recognizing journal/magazine titles in database records.

Themes that students wrote about

• 65% described what they did when searching.

• 60% mentioned relevance/irrelevance.

• 31% mentioned specificity.

• 22% mentioned credibility.

• 20% mentioned evaluation.

Subject Consistency: Student Research Question, Keywords, Article Title

To determine if the students were working on the tutorial in good faith, the authors looked at subject consistency among the responses for student topic, two keywords, and the article title. If at least two to three of the items entered were on one topic (that is, clearly related), the authors considered that the student was working in good faith. Data in table 20.3 demonstrates that 95 percent of student responses included at least some subject consistency (ratings 3, 3.5, or 4), and 71 percent of responses were coded 4 or 3.5, indicating a strong level of subject consistency. From this code, the authors inferred that 95 percent of the students worked on the tutorial in good faith; this code served as a validity check.

Table 20.3
Subject Consistency in Responses, a Validity Check

a. One response was inadvertently skipped by one coder for this part of the analysis.

Search Statement Elements: Keywords

To see how well students could decide on keywords for their research topic, the coders rated the student keyword choices as: poor choices (not on topic); technically functional but not the best; and good choices. An example of a response that was coded as “functional, but not the best” is “obesity AND world” for the topic statement, “Is obesity increasing around the world?” The search terms are functional in that they will return some usable results, but there are more effective keywords one could use to get better results. An example of a good choice would be “women AND refugees,” for the research topic “How are women refugees treated in comparison to men refugees?” Adding a third keyword would certainly make this search more effective; however, the tutorial provided space for only two keywords. (This was a result partly of screen layout constraints, and partly of the fact that two keywords often are enough to start a database search.) Results in table 20.4 show that 55 percent of students chose their keywords well (as coded by at least one coder, i.e., the 3 and 2.5 codes added together); 35 percent of students chose keywords that would yield some good results, but their choices were not the best; while 10 percent of students made poor choices.

Table 20.4

Search Statement Construction—Keyword Choices


Search Statement Elements: Boolean Command Choices

Analysis of students’ choice of Boolean commands consisted of the following ratings: poor choice; functional technically but not the best; and good choice. An example of a poor choice would be “Genetically Modified Foods NOT Outside of America” for the topic “Do genetically modified foods hinder american [sic] health?” An example of functional technically but not the best would be the selection of the keywords and command “Psychological Disorders OR Refugee Health” for the question “Are psychological disorders a common occurrence in refugee health?” The results of the analysis are shown in table 20.5. A key takeaway is that 90 percent of students chose the functionally most correct command.

Table 20.5

Search Statement Construction—Boolean Command Choice

Article Identification: Publication Title

There were five choices for the code for recognition of publication titles: not a serious answer; not from a database; not a source title; probably popular, trade, or news; and probably scholarly. The modifier probably was added because, though the authors did research some publication titles, they did not verify the publication or its type in every case. Some titles were already known to the authors, so they could label them from prior knowledge; for other titles they made educated guesses as to the type (e.g., The New York Times is a well-known newspaper; JAMA is a well-known scholarly medical journal; and a good guess for Journal of the XYZ Association is that it is scholarly). Findings for this code are shown in table 20.6 and demonstrate that 60 percent of students, and possibly as many as 70 percent (if the 3.5 split-decision codes are included), could correctly identify their publication or source title in the database record. More than half, 52 percent (if the 4.5 split-decision codes are included), chose scholarly articles for their examples, and this happened without any explicit instruction to do so. As many as 30 percent (if the 3.5 split-decision codes are included) were not able to identify the publication title for their article; instead these students typed in phrases such as “journal article,” “academic journal,” or “Ebsco.”

While it may be comforting to think that students have already learned to recognize and value scholarly sources over popular ones, it is also possible that, given the nature of their search topics (many were health-related), more scholarly sources were simply retrieved. The fact that 21 percent, and possibly as many as 30 percent, of students (if the 3.5 split-decision codes are included) were not able to identify the publication or source title for their article in a database record confirms the anecdotal experience of many librarians working at a reference desk: understanding how to decipher a database record is a problem for many students.

The analysis of the results for this code provided a second validity check on students’ good-faith efforts. Compared to the subject consistency code discussed earlier, where the authors estimate that 5 percent did not work in good faith, a slightly higher percentage, 9 percent, did not use the database to locate their sample article (e.g., www.cbsnews.com was entered as a publication title). Alternate explanations for why students did not use the database include that they did not understand the instructions, or did not see or understand the link in the tutorial that would open a new browser window with the database in it.

Table 20.6

Recognition of Publication Titles

Students also responded to three open-ended reflection prompts:

• What did you do after entering your initial search statement?

• What challenges did you have?

• What did you learn about the academic research process?

In the fourth iteration of the reflections codebook, the major code groups drawn from the student reflections were

• Volunteered information: past experience, emotional expression, Google comparison.

• Response contents: basic description of items mentioned by the student, e.g., instances of the student mentioning or describing Boolean commands, keywords, search results, research question or topic; credibility, efficiency, specificity; evaluation or evaluation criteria and process.

• Specific details about database searching (e.g., searching, features, limits, etc.); these tend to be found in longer responses.

Reflection Code Group: Volunteered Information

There were too few responses coded for volunteered information to analyze them in any depth with statistical reliability. Although these topics were not specifically addressed in the reflection prompts, table 20.7 shows that nearly 20 percent of students voluntarily addressed them. (Since 142 responses, or 80.23%, were coded N/A for this code, the authors deduce that 35 responses, or 19.77%, received this code.) Sample student responses included

• Past experience: “I’ve done this stuff before,” “I’ve never seen resources like these before.”

• Emotional expression: “This made me anxious,” “I was relieved at how easy it was.”

• Google comparison: “This is harder/easier/faster/more reliable than Google or a web search engine.”

Table 20.7

Volunteered Information in Reflections (coded by at least one coder)

Reflection Code Group: Response Contents

Coders could potentially have selected almost all coding choices for the response contents theme, which makes interpreting the results more complicated. Table 20.8 shows the topics that students wrote about most; the topics were predictable, as they reflect the first two prompt questions. Students mentioned their research question or topic, relevance, keywords, and search results, all directly related to the task they worked on in the tutorial: database searching. Students used several words in ways that were ambiguous; context often gave a likely meaning, but not always. For example, “links” could mean articles or results; “options” could mean articles, results, or database limit features; and “sites” could mean databases or results.


Table 20.8
Response Contents in Reflections (coded by at least one coder; % of total responses, n = 177)

• Described search results mechanically: 46.33
• Described search results qualitatively (e.g., good/bad; relevant/not relevant)

Reflection Code Group: Database Searching Specifics

The codes in table 20.9, on database searching specifics, evolved through the various iterations of the codebook. They relate to phrases or topics that recurred regularly, which also fall into the broader codes in the response contents code group from table 20.8, but include more detail. They tend to indicate a lengthier response from the student. Not quite half of responses received one of these codes (48.59%). Perhaps this code group can also serve as a rough estimate of how many students were willing to write more than the bare minimum in their responses.


Table 20.9
Database Searching Specifics (coded by at least one coder; % of total responses, n = 177)

Communicating Results and Impact

Summary findings from the analysis were shared with the department head of the library’s instruction department, the English instructors who allowed the authors to recruit their students for the study, and the FYE program faculty director. The information was also shared in a presentation at the 2016 Wisconsin Association of Academic Librarians Conference. The data demonstrated how an e-learning tutorial and embedded librarianship can provide a means to measure information literacy skills and understanding in students. The tutorial submissions also provided some guidance for librarians wishing to tailor the precious time of their one-shot instruction sessions to better address the skills gaps of their students.

The process of extracting the data for analysis also revealed how the lack of a usable interface for viewing the LMS SCORM data can be a major hindrance to the tutorial being used by faculty and librarians. These revelations led the researchers to develop a more user-friendly SCORM report using Microsoft Word and Excel. The authors hope that by sharing their workflow as well as the code for the tutorials, other libraries may use this case study as a template to assess and improve information literacy instruction and measurement at their own institutions.
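One possible automation of such a report, given that the data lives in an HTML table, is to parse a locally saved copy of the SCORM Reports page into a CSV. The sketch below uses only the Python standard library; the file names are hypothetical, and the report page would first need to be saved out of D2L by hand.

import csv
from html.parser import HTMLParser

class TableExtractor(HTMLParser):
    """Collect the cell text of every table row on the page."""

    def __init__(self):
        super().__init__()
        self.rows, self.row, self.cell = [], [], []
        self.in_cell = False

    def handle_starttag(self, tag, attrs):
        if tag in ("td", "th"):
            self.in_cell = True

    def handle_endtag(self, tag):
        if tag in ("td", "th"):
            self.row.append("".join(self.cell).strip())
            self.cell, self.in_cell = [], False
        elif tag == "tr" and self.row:
            self.rows.append(self.row)
            self.row = []

    def handle_data(self, data):
        if self.in_cell:
            self.cell.append(data)

with open("scorm_report.html", encoding="utf-8") as f:  # saved report page
    parser = TableExtractor()
    parser.feed(f.read())

with open("scorm_report.csv", "w", newline="", encoding="utf-8") as out:
    csv.writer(out).writerows(parser.rows)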

Leveraging the Findings

The findings from this study were very useful to the library. The initial analysis of the fall 2014 data guided revisions to the tutorial. The fall 2015 analysis gave a more detailed view of student performance with IL tasks and concepts, and an understanding of the pitfalls and limits of the LMS and SCORM environment. As universities are pressed by accrediting bodies for more programmatic assessment efforts, this type of tutorial could be part of the solution.
