
Title:
Experiential Learning about Database Design in the MLIS Curriculum: Software and Instructional Design for Threshold Knowledge
Language:
English
Source:
Journal of Education for Library and Information Science. 2025 66(3):203-224.
Availability:
Association for Library and Information Science Education. Available from: University of Toronto Press. 5201 Dufferin Street, Toronto, ON, M3H 5T8 Canada. Tel: 416-667-7929; Fax: 416-667-7832; e-mail: journals@utpress.utoronto.ca; e-mail: office@alise.org; Web site: https://www.utpjournals.press/loi/jelis
Peer Reviewed:
Y
Page Count:
22
Publication Date:
2025
Document Type:
Academic Journal; Journal Articles; Reports - Research
Education Level:
Higher Education
Postsecondary Education
DOI:
10.3138/jelis-2023-0054
ISSN:
0748-5786
2328-2967
Entry Date:
2025
Accession Number:
EJ1477693
Database:
ERIC


Experiential Learning About Database Design in the MLIS Curriculum: Software and Instructional Design for Threshold Knowledge 

With almost all MLIS and similar degree programs having a required course covering database design, it is critical to develop curriculum that engages students in learning essential concepts and how to implement designs. We report on a project that explored how essential knowledge about database design can be taught effectively through experiential learning activities involving collaborative groupwork and using database software to build small databases and search forms. The focus is an existing required introductory course on information retrieval system design, in which students learn about database design concepts and implement their designs in browser-based software, building the database structure, content, and search forms. They learn about critical concepts in database design, including eliciting user needs and requirements, prototyping, teamwork, and iterative testing and evaluation. The project had two aims: to select new database software to use as a learning tool in the course and to redesign course assignments so that learning experiences would be enhanced. The project led to important findings about the student learning experiences, to improvements in the curriculum, and to recommendations for educators involved in similar courses.

Keywords: database design; database software; experiential learning; information organization; information retrieval; MLIS curriculum; threshold concepts

Key Points:

Critical knowledge (threshold knowledge) about database design can be taught effectively through experiential learning activities that integrate collaborative group design work and hands-on building of small databases.

Conducting user research in iterative phases is an essential element in curriculum design and an effective approach to developing active learning assignments and self-guided support materials.

A blend of research methods for the user research is most effective when it includes think-aloud strategies and semi-structured interviews in the early phases, followed by questionnaires in the later phases.

Over 82 percent of the largest MLIS degree programs in the United States and Canada include a required course covering information organization and retrieval ([3]; [11]). The course content generally covers essential knowledge about designing databases that serves as a foundation for elective courses in a range of areas, from cataloging and archival metadata to reference services, and on to website design, content management systems, and user research. This project was a deep exploration of how essential knowledge about database design can be taught effectively through experiential learning activities involving hands-on groupwork and database software. The project had two aims for the MLIS program's existing introductory course on information retrieval system design: to replace the database software used as a learning tool in the course and to redesign the assignments to enhance learning experiences. In addition, the new software needed to fulfill new university security requirements. The project team had two researchers (also the authors of this article): the professor who is the course coordinator and a current MLIS student with prior work experience reviewing software in an educational setting and a master's degree in educational psychology. The process of selecting new software and creating the new assignments involved three phases of user research with students that led to useful findings about their experiences, to improvements in the curriculum, and to recommendations for educators involved in similar courses.

Our positioning in undertaking the project was that long-lasting learning involves internalizing concepts through meaningful, hands-on, experiential activities, "process[es] grounded in experience" ([19], p. 29). Courses in the MLIS curriculum are no exception to this. We report on how database software is used as a learning tool in the information retrieval system design course, one of three required courses in a totally online MLIS program, in support of integrative learning. The course prepares students to take elective courses on a diverse range of topics in the domains of design, organization, and retrieval, such as cataloging, library automation, and web usability. The project involved changing from one software program to another and simultaneously redesigning the course assignments on database design at an introductory level. With course enrollment at around 1,000 students each year, and with 11 different faculty teaching the course, extensive preparation, evaluation, and user testing prior to roll-out were essential.

The project plan needed to (1) ensure that the software we selected would fully meet curricular needs, learning objectives, and the university's heightened security requirements; (2) support the course's core competencies; and (3) enable a smooth launch in the initial semester (expecting 425–450 students). As part of the project, we evaluated different software, created software tutorials and new assignments, and conducted three phases of user research with students to assess both the assignments and tutorials, refining the materials after each phase.

To begin, we established a rigorous process to inform our decision on the next software to use in the course, starting with developing a list of candidate software programs. The process involved three steps. First, we outlined the feature requirements, such as support for non-bibliographic databases, different types of database fields (dropdown lists, text fields, etc.), and word- and phrase-parsing of textual content for search purposes. Second, we conducted an environmental scan of similar introductory MLIS information retrieval and organization courses to discern if their database design–centered assignments used software to provide students with hands-on experience. Third, we selected the strongest software candidates and evaluated them against our requirements. After extensive testing by the two project team researchers, we selected the top candidate for further study. At this point, we had the university's technical team approve our choice as meeting campus security requirements. We then conducted three phases of user research: alpha testing (interviews and think-aloud, n = 4), pilot course testing (n = 58), and questionnaire evaluation with full semester enrollment (n = 435).

Introductory courses in information retrieval have covered database design principles since at least the 1990s ([37]). In more recent years, our course has included hands-on implementation of students' designs using database software. Previous software used in the course included InMagic® DBTextworks® and WebData Pro. (DBTextworks was replaced primarily because, at that time, it required Windows, and Mac users needed to install Windows emulation software. WebData Pro needed to be replaced for two reasons: it did not meet the heightened security requirements, and it lacked the user-friendliness needed in an introductory course.) Two database design assignments use software for the implementation and evaluation stages: an individual exercise and a group project. For the exercise, each student begins with a pre-set design of a database with four fields that they must implement in the software, then build records, and, finally, create a search form and share it with others in the class. The exercise serves as a warm-up for the larger group project that follows. The project assignment is designed to simulate the trajectory of a product lifecycle so that students have the opportunity to experience design and development stages ([33]). Similarly, the stages align with design thinking stages, namely empathize-define-ideate-prototype-test-implement ([7]; [14]), and students learn about the importance of iteration when designing products. Each group writes a statement of purpose and develops a data structure, both of which guide their design decisions and implementation going forward. (Another preparatory exercise, done as an online discussion, has provided practice in designing a data structure for a collection of objects.) In their initial meeting, they do a skills inventory and assign roles, so that each group has a leader/facilitator, tech lead, scribe, and writer-editors.

Over the years, students have been inventive in the types of collections they have chosen to design for, with some popular choices being chocolate bars, craft beers, and ice cream. Other databases have reflected the times we live in, such as face masks and hand sanitizers starting in spring semester 2020. Designing and building databases for objects that cannot be represented in familiar bibliographic records requires students to think from the ground up, being guided by the objects' attributes and what the intended users need as access points. Their design decisions about the kinds of fields to create mean that their group discussions center on conceptual challenges. For example: Will a simple dropdown list suffice for the users' search needs? Should we generate a controlled vocabulary for this field in order to support both aggregation and discrimination for searchers? Will users need to search product names as full phrases only, or should we build the field so they can search individual words within the names too? Or, more specifically, is it possible to have a name authority list for the chocolate bar manufacturers? Once they move on to implement the database design in the software used in the course, the software features need to support the design decisions.
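
The distinction between a controlled-vocabulary list field and a free-text field parsed at the word level is one of the recurring design decisions described above. As a minimal illustration (not tied to Caspio or to any software actually used in the course), the following Python sketch contrasts word-level and phrase-level matching on a hypothetical product_name field; the field names and toy chocolate-bar records are invented for this example.

```python
# Illustrative sketch only: hypothetical field definitions and toy records,
# not the behavior of Caspio or of any software actually used in the course.

from dataclasses import dataclass

@dataclass
class Field:
    name: str
    kind: str               # "list" = controlled vocabulary; "text" = free text
    vocabulary: tuple = ()   # allowed values when kind == "list"

# One possible structure for the chocolate-bar database discussed above.
fields = [
    Field("manufacturer", "list", ("Theo", "Tony's Chocolonely", "Lindt")),
    Field("cocoa_type", "list", ("dark", "milk", "white")),
    Field("product_name", "text"),   # free text: word- or phrase-searchable
]

records = [
    {"manufacturer": "Theo", "cocoa_type": "dark",
     "product_name": "Sea Salt Dark Chocolate"},
    {"manufacturer": "Lindt", "cocoa_type": "milk",
     "product_name": "Swiss Classic Milk"},
]

def word_search(term, field_name):
    """Word-level parsing: match records whose field contains the single word."""
    return [r for r in records if term.lower() in r[field_name].lower().split()]

def phrase_search(phrase, field_name):
    """Phrase matching: the field value must contain the exact phrase."""
    return [r for r in records if phrase.lower() in r[field_name].lower()]

# The word "dark" and the phrase "dark chocolate" both find the Theo bar here,
# but only word-level parsing would also match a name like "Dark Forest Milk".
print(word_search("dark", "product_name"))
print(phrase_search("dark chocolate", "product_name"))
```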

Literature and learning contexts

Our search for a new database learning tool was informed by three main areas of relevant literature: andragogy; active, kinesthetic, and experiential learning; and threshold concepts. We were also designing curriculum for the unique learning environment of an exclusively online postgraduate program.

Andragogy, or the method and practice of teaching adult learners, is characterized by a focus on problem-solving; gaining practical, professional skills; and self-directed learning ([17]; [25]; [26]). Adult learners tend to be intrinsically motivated and prefer self-paced, task-oriented, practical instruction. They bring accumulated life experiences to the classroom ([6]). To facilitate the self-directed learning process—and, at the same time, to support those students who may not be highly self-directed—adult learning curricula should be designed to foster independent exploration, while also providing structure and robust resources. In addition, because one of the key characteristics of adult learners is their need to understand the applicability of the course material ([17]), it is critical to design assignments that clearly relate to students' life experiences.

Also fundamental to the design of this introductory course was the principle of active learning, characterized by "intentional engagement, purposeful observation, and critical reflection" ([15], p. 39). By applying the course concepts to the design and construction of a functioning database, students actively engage in the learning process, in hands-on, kinesthetic learning activities. Throughout the course, students also reflect on concepts learned, including critical evaluation of a peer group's database design and a reflection essay on their own group database. They apply concepts through active learning by implementing their designs in the database software, further supporting them in independent exploration and self-direction. This course design encourages students to engage in deep learning as they "seek meaning, connect to prior knowledge, actively engage material, view content from multiple perspectives, and organize content holistically with a deliberate intention to master content" ([28], p. 36). The course's experiential learning activities ([18]; [19]) are not only engaging ([4]) but also perceived by the students as enjoyable, aligned with the aphorism "To put what you have learned into practice—isn't that a great pleasure?" ([9], p. 3). In addition, they are internalized and support the integration of new knowledge into an adult learner's professional identity ([10]; [33]).

The third key area of relevant literature is threshold concepts theory, which was developed to aid in curriculum design ([27]). Briefly, threshold concepts are critical concepts that, once understood, transform the learner's understanding of the domain and affect the learner's worldview and identity as the new knowledge is internalized ([10]). They are further characterized by being irreversible, integrative, and deeply affective of the student's discourse and prior conceptual stances ([23]). When threshold concepts are used in curriculum design, learning activities are intentionally designed, with opportunities for students to discover new knowledge and to make connections between concepts and experiences ([22]). The course's learning objectives are informed by threshold concepts for the domain of information retrieval, as investigated in previous research ([32]; [35]). For example, grasping the threshold concept of information structures ([34]) may be manifested in evidence of students' effective use of database design tools and understanding fields, subfields, and filters.

Courses in the MLIS program are conducted totally online and are designed for accessibility, with materials provided in a variety of formats (audio, written, video with captioning) and with a structure that is conducive to flexible engagement (i.e., easy to start and stop, multiple on and off ramps). Flexibility is, in fact, the most-cited strength of the MLIS program, based on exit surveys with new graduates ([30]). To facilitate a successful remote learning experience, instructors follow guidelines to

Point learners to resources to support the use of the technology. Include links to troubleshooting tips and accessibility standards. Create or cultivate a short video on signing up for, accessing, and using the basics of the selected software—or facilitate a walk-through of these basics as time permits in a synchronous session. Provide examples from previous students alongside basic best practices in designing visuals. ([28], p. 46)

Students engage with the course via a range of devices—phones, tablets, laptops, and desktops—and this informed our work, from our choice of a browser-based database design tool to creating learning materials in responsive formats. Criteria used in evaluating database software to use in the course are covered in the section below on "Evaluation of candidate software." To consolidate the materials, we created a WordPress™ blog to serve as a digital hub for the course, with database assignment instructions, software tutorials in two formats (screencasts and annotated screenshots), and FAQs: https://ischoolblogs.sjsu.edu/202/.

Another aspect of the course is that projects are done in small groups, with group meetings held via Zoom™. Virtual groupwork not only serves to enhance student engagement and comprehension; it also prepares students for a future work life that includes collaborating with colleagues across physical locations and time zones. As [28] noted, collaborative assignments can be "challenging to replicate in online settings, yet [this is] the most vital skill sought by employers" (p. 42). In this introductory course, students first build a simple database on their own, following provided specifications. Then they collaborate with others to design and build a more complex database, thus fostering their "self-direction and self-regulation, which can help learners meet the course goals" ([8], p. 11). This structure allows students to get their feet wet with the concepts and the software before moving on to more advanced work in database design. Along with the database software, students utilize many other digital collaboration tools, such as Zoom™, Google Docs™, Google Drive™, WordPress™, Canvas®, and more. These tools are beneficial for learning database design, particularly in a fully online program, and to develop skills for careers that typically include remote/hybrid teamwork.

Student perceptions of the complexity of the concepts in the course also played a role in the process of selecting a new software. From previous anecdotal reporting, we were keenly aware of anxiety among students around the technical aspects of the course assignments. With this in mind, we gave high priority to choosing software with an easy-to-learn interface for the essential creator functions of implementing a database design, creating records, and building search forms. (We also intentionally measured student anxiety levels at the start of the course in the summer pilot and the fall roll-out courses, finding that over 50% of students surveyed indicated some level of anxiety about learning to use the software, with 25% reporting high levels of anxiety. See Figures 1 and 4.)

Methodology

We conducted three preliminary stages of research to prepare for selecting candidate software for the course. The first stage was an environmental scan of introductory courses in information retrieval or information organization in MLIS degree programs in the United States and Canada in order to explore software being used in similar courses. Second, we constructed a list of software features necessary to the learning objectives, based on existing curriculum and future plans. Third, we explored the marketplace, read software reviews, and tested candidate products.

Environmental scan

An essential early step in our search for a new database learning tool was conducting an environmental scan of MLIS programs to find 10 to 15 programs comparable to San José State University's. To compile our list, we cross-referenced schools on the Association for Library and Information Science Education (ALISE) member list and the iSchools.org member list ([16]), prioritizing larger programs (i.e., higher student enrollment) and those with the iCaucus membership level. We reviewed program websites (course catalogs, syllabi, course pages) to determine the scope of introductory information retrieval systems and design courses. We were aware that software such as MS Access™ was used in non-introductory and elective database courses, but the students and learning objectives were too different for this information to be useful, so it fell outside the scan's boundaries. We faced a number of challenges in this stage of the environmental scan. The vast majority of the programs we reviewed provided little in the way of specific course information. In general, programs provided a short overview of course topics, but no publicly available syllabi or detailed description of course objectives. In cases where syllabi were published, they were often vague and did not provide detailed information about any software used to learn about databases. Our impression was that most courses appeared to teach information retrieval systems in a conceptual manner, with no hands-on component. Our first list included 29 relevant courses at 20 schools. For the 12 courses with a hands-on component but without a detailed syllabus, we reached out via email to the instructors. The four faculty members who responded indicated that they were utilizing MySQL™, which did not meet our criteria for a browser-based program or for an introductory course. See summary in Table 1.

Table 1: Results from environmental scan of MLIS courses on information organization and retrieval

MLIS programs with relevant courses: 20
Relevant courses identified: 29
Syllabi needing clarification: 12
Clarifying responses received from faculty: 4

Evaluation of candidate software

As we conducted our environmental scan, we simultaneously developed a set of software requirements based on assignment goals, learning objectives, and university security requirements for enterprise software. Candidate software needed to be browser-based to meet the wide range of devices used by students. Of primary importance for the feature set were the following:

Creator-side (database-building) features:

user-friendly interface for creator functions

variety of field formats (lists, text, numbers, yes/no)

word and phrase parsing (syntactic parsing and indexing)

ability to create a variety of search pages and submission forms

ability to indicate that a field is required; that is, data must be entered into the field when a record is created

access sharing (with the ability to grant specific, limited, or different levels of access), necessary for groups designing a database to grant testing access to another group for evaluation purposes

User-side (search and results) features:

user-friendly interface for user functions, such as search pages

ability to search on multiple fields, combined with different logical operators

clear results pages

Because our environmental scan did not reveal any potential candidates, we returned to Ninja Tables, a software package that had been previously identified and tested by course faculty. Although it was user friendly and had the ability to function as a WordPress plug-in, it failed to meet other requirements, such as phrase parsing, required field setting, and limited shared access. In addition, the search function was a Google-style search bar, lacking in precision, although it did have the ability to specify which fields to search. We also evaluated AirTable and Kintone; however, these products fell short of our user-side requirements, including a lack of robust search pages and ability to parse search terms. By conducting Google searches for software reviews, as well as checking CNET™ and Capterra™ (websites that review software), we identified Caspio™ as a serious candidate for further evaluation. Table 2 summarizes the results of the software evaluations.

Table 2: Results from evaluation of top candidate software products

Ninja Tables. Parsing (word vs. phrase): Yes. Required field: No. Search form: Yes, with custom filters. User friendly: Yes. Shared access: Yes. Relevant extra features: none listed.

AirTable. Parsing (word vs. phrase): No. Required field: No. Search form: No, just a search bar. User friendly: Yes. Shared access: Yes. Relevant extra features: none listed.

Kintone. Parsing (word vs. phrase): No. Required field: Yes. Search form: No, just a search bar. User friendly: Somewhat. Shared access: Yes. Relevant extra features: none listed.

Caspio™. Parsing (word vs. phrase): Yes (comparison types on search report). Required field: Yes. Search form: Yes, on separate page shared with URL or embedded on another page. User friendly: Yes, but a little more complicated than Ninja Tables. Shared access: Yes. Relevant extra features: add records on a form; can embed on WordPress blog.
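
As an illustration of the screening logic behind Table 2, the short sketch below encodes the table as data and filters for products meeting the must-have requirements; the attribute names are our own shorthand for the column headings, not vendor terminology, and the coarse True/False values simplify the table's qualified answers.

```python
# Table 2 rendered as data, with a simple screen for products satisfying the
# must-have requirements. Attribute names are our shorthand, not vendor terms;
# Kintone's "Somewhat" user-friendly rating is coarsened to True here.

candidates = {
    "Ninja Tables": {"parsing": True,  "required_field": False, "search_form": True,
                     "user_friendly": True, "shared_access": True},
    "AirTable":     {"parsing": False, "required_field": False, "search_form": False,
                     "user_friendly": True, "shared_access": True},
    "Kintone":      {"parsing": False, "required_field": True,  "search_form": False,
                     "user_friendly": True, "shared_access": True},
    "Caspio":       {"parsing": True,  "required_field": True,  "search_form": True,
                     "user_friendly": True, "shared_access": True},
}

must_have = ("parsing", "required_field", "search_form", "shared_access")

shortlist = [name for name, features in candidates.items()
             if all(features[key] for key in must_have)]

print(shortlist)  # ['Caspio'] -- the only product meeting every must-have feature
```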

After determining that Caspio was the top contender for meeting our creator and user requirements, we evaluated its functionality against the course's two database construction assignments. At the same time, we assessed the user-friendliness of the interface, with the hope that it was neither too advanced nor so simplified that students would get limited benefit from the exercises and from implementing their original database designs using the software. In our initial trials, we found Caspio to be easy to learn for basic database features, while also leaving room for complex database design questions and applications.

Once these initial evaluation steps were completed, we proceeded to develop prototype course assignments and support materials in order to evaluate the Caspio software in user testing with students.

Redesigning a core course: Iterative user research

We outlined three phases of user research to evaluate, test, and fine-tune the course materials and to help ensure a smooth launch in the first semester of the redesigned course: (1) alpha testing; (2) beta (pilot course); and (3) roll-out (15 sections of course). In preparation for the alpha phase, we developed prototype assignments and support materials, including software tutorials and screencasts on a new WordPress blog. Throughout the project, we were guided by the threshold concepts underpinning the course, as expressed in the course's learning objectives (CLOs) ([31]). The course has nine CLOs, inclusive of vocabulary design and information architecture basics. The project focus was on the three CLOs relevant to database design:

Identify an appropriate user group for an IR product, assess their information needs, conduct user research, and design an information retrieval system to meet those needs;

Explain and apply basic design principles for usability, focused on the content and organization of information for retrieval;

Learn database management software in order to implement database design, information structures, and create search interface. ([31], para. 7)

As noted earlier, this course is one of three required core courses for the MLIS degree, and it supports four core competencies in addition to its CLOs. Together with a required research methods course, the core courses support all of the program's 14 degree competencies ([29]), which are based on the American Library Association's (ALA) professional competence list ([1]). (The supported competences are regularly updated to reflect revisions in the ALA competences; see [2]). The course supports these four competences:

E. Design, query, and evaluate information retrieval systems.

F. Use the basic concepts and principles related to the selection, evaluation, organization, and preservation of physical and digital information items.

G. Demonstrate understanding of basic principles and standards involved in organizing information such as classification and controlled vocabulary systems, cataloging systems, metadata schemas or other systems for making information accessible to a particular clientele.

H. Demonstrate proficiency in identifying, using, and evaluating current and emerging information and communication technologies. ([29], para. 3)

With the central concepts and course activities designed based on threshold concepts curriculum elements ([22]), the assignments support students in actively discovering and experiencing new knowledge ([33]) of the competences.

The user research was aimed at evaluating the materials for these objectives and revising them iteratively, using three phases:

Alpha test: In late spring 2022, we conducted an alpha test with four students who had taken the previous version of the course and used the previous database software.

Beta test: In summer 2022, the course was piloted with 58 students in two sections.

Full roll-out: In fall 2022, the course was offered in 15 sections taught by eight instructors, with 435 students.

Phase 1: Alpha test—Participant interviews

The first phase of user research was an alpha test with four participants, students who had completed the course in the last six months using the previous (WebData Pro) software. The participants were selected by the lead researcher, based not on their achievements in the course but on their ability to verbalize their thoughts and to be uninhibited—even with their previous instructor present—in sharing their thoughts about the materials and the software. As we planned to use a think-aloud protocol ([12]; [24]) and semi-structured interview questions ([20]), these abilities were critical to eliciting a meaningful understanding of their learning experiences and feedback on the new materials. As described by [36], there are considerable differences in subjects' abilities to verbalize their thoughts while undertaking assigned tasks. Prior to the alpha test, we conducted a trial run of the exercise instructions and the interview script, which helped us be prepared for questions about the software interface (e.g., "Why was it called search page before but search-and-report now?"), but we made no major changes to the script.

As noted by [21], "Instruction in the think-aloud strategy must be modeled" (p. 497). Accordingly, we prepared each participant by demonstrating how think-aloud is done. Next, we described the scope of the session: (1) they would be asked to do an exercise, conducting specific tasks to build a database and a search form, and we wanted them to speak aloud what they were thinking as they did the tasks so we could better understand their decision making, impressions, and experiences; (2) they would use the exercise instructions and the tutorials, including watching one of the screencasts; (3) when the exercise was completed, we would ask a few questions about the experience, providing the opportunity to "think after" ([5]); and (4) they would review the project assignment instructions on their own after the interview and complete a questionnaire about the instructions and the full experience. Each interview session lasted 60 to 70 minutes, and each participant spent 20 to 30 minutes completing the questionnaire afterwards.

The most significant results from the alpha test came from the interviews and think-aloud narratives. We were able to explore each participant's experience with the new materials in nuanced and individualized ways, using prompts and clarifying questions to help in understanding their impressions and the sources of any obstacles encountered. The questionnaire given to them as a take-home activity provided further feedback, and it elicited specific input on the project instructions. This was key to refining the project instructions in preparation for Phase 2, the pilot version of the course. With the small number of participants (four), a typical size for alpha testing, we could not generalize from the questionnaire data; however, several findings were noteworthy:

three of the four students found the software interface "very easy" to navigate; the other found it "easy";

regarding the take-home evaluation of the project instructions, all four students found the training materials to be "more than sufficient" or "very sufficient";

all students commented on the strength of having both formats of slides with annotations and screencasts.

We determined that these findings would not only help to guide modifications to the materials before piloting the revised course but also serve as a baseline for the evaluations we would conduct in the next two phases.

Phase 1 outcome: Modifications to learning materials

The primary purpose of the Alpha (Phase 1) had been twofold: first, to make appropriate refinements to the two assignment instructions by addressing any confusion or incompleteness encountered by the alpha participants; and second, to modify the tutorials as needed to resolve any gaps or areas that caused confusion for the alpha participants. We added FAQs to the blog and fixed terminology so that the screencast narratives and screenshots of the product were precisely aligned. Last, we added links from the annotated screenshots to the relevant screencast(s), making it easier for students to use both formats of the tutorials.

Phase 2: Beta test—Piloting the course and questionnaire

Phase 2, the beta test of the course pilot, was conducted during summer 2022. Summer semester has a compressed schedule, with the usual 16 weeks of course content covered in 10 weeks. Also, just two sections of the course are offered, in contrast to 14 to 15 sections offered in the fall and spring semesters. In summer 2022, the two sections were taught by different instructors (one being the first author), with a total of 58 students.

We used a questionnaire to collect feedback on the assignments and support materials, with 13 questions covering three areas: (1) pre-existing knowledge, anxiety level, group role; (2) assessment of specific aspects of the materials; and (3) final take-aways and reflections on the software interface, assignments, overall experience, and advice for future students. Forty-seven students submitted the questionnaire, a response rate of 81%. We attributed the high response rate to having told the students about the importance of their role in the design of the course materials, for example, with announcements such as "You are the first group of students to use Caspio and the new assignments—your input is very important!"

Results from the questionnaire indicated that very few students had pre-existing knowledge of database design (4.3%) and that a high number had anxiety about learning to use database software (78.7%) (see Figure 1).

Figure 1: Phase 2—Students' pre-existing knowledge and anxiety level

Students in the pilot gave high ratings, 85.1%, to the effectiveness of the support materials provided on the new course blog. They also reported on how they engaged with the materials, whether by using the annotated screenshots, screencasts, or a combination, and they made suggestions about materials to add to the blog. A follow-up question measured their confidence level after completing the individual exercise, which was high at 66% (see Figure 2).

Figure 2: Phase 2—Materials: Overall effectiveness, forms of engagement, and usefulness ratings

In evaluating the overall user experience, we learned that 66% of students found the software interface to be easy or very easy and intuitive to navigate (see Figure 3). Students were also asked to summarize their experience and to make recommendations or share ideas for improvements. See Figure 3 for responses representing themes in the student reflections and suggestions.

Figure 3: Phase 2—Overall user experience

Phase 2 outcome: Modifications to learning materials

With the feedback from students in the pilot version of the course, we were able to significantly improve the assignment instructions and the blog support materials. Specifically, we determined that we needed to cover several advanced topics; however, we were selective here. In some cases, the more eager students reported diving into Caspio's own tutorials to find their answers, but in other cases, we felt that it would be best to create our own. Although one student opined in the feedback survey that there should be tutorials for every possible database function, we recognized that this was neither feasible nor desirable in an introductory course. Ultimately, we covered six additional features, adding two new tutorials and four explanations integrated into existing tutorials. We also learned that the blog's discussion forum was used by only one student and that it had generated confusion about when to use the Canvas course's discussion forum; as a result, we removed it. For the assignment instructions, we added embedded links to relevant tutorials, making it easier for students to jump to the supporting materials on the blog.

Phase 3: Roll-out and full course evaluation—Questionnaire

For Phase 3, we used the Phase 2 questionnaire after including an additional question to address the added tutorials. This was distributed to all 15 course sections (435 students). It is worth noting that one of the priorities in the development and offering of the course is having consistent outcomes but also ensuring that it is not experienced by students as "canned." To achieve this balance, it is critical that faculty reflect their own areas of expertise and their own teaching style. There is shared content for the assignments, resources, and customized textbook, but also open modules for which instructors create their own materials. In addition, each instructor creates videos to introduce each shared module and assignment. During the roll-out semester in fall 2022, instructors strongly encouraged students to complete the questionnaire, letting them know how important their feedback was. As a result, the response rate was quite high (43%), although lower than in the summer pilot course (81%). This was due largely to having offered a small extra credit for completing the survey in Phase 2, whereas the Phase 3 survey completion was not incentivized. Multiple studies, however, have shown that there is little difference between survey responder and non-responder data in surveys of college students ([13]).

The lack of pre-existing knowledge of the respondents in Phase 3 was similar to the students in Phase 2, as shown in Figure 4, with 71.3% reporting that almost everything was new to them at the start of the course (compared to 68.1% in Phase 2, the summer pilot course). The overall level of anxiety was similar, but differently distributed (see Figure 4).

Figure 4: Phase 3—Students' pre-existing knowledge and anxiety level

Eighty-six percent of respondents found the blog materials effective in supporting them, and 69.7% felt confident or highly confident to do the group project after completing the individual exercise. Student engagement with the materials most typically took the form of using the screencasts while working on the assignments (43%), while 38% used a combination of the PDF slides and screencasts. In their ratings of the new course materials on the blog, students gave especially high marks to the screencasts, as was true in Phase 2 (see Figure 5).

Figure 5: Phase 3—Materials: Overall effectiveness, forms of engagement, and usefulness ratings

Questions about the overall user experience revealed that 60.1% of students found the software interface to be easy or very easy and intuitive to navigate (see Figure 6); this was lower than in the summer pilot of the course. We attributed this difference to two factors at play: first, summer students are more likely to be carrying a lighter courseload and thus have more time to explore the software features; and second, summer students were highly aware that they were piloting the materials and likely felt more adventurous with the software and the assignments. This high level of engagement was in line with their much higher response rate to the questionnaire (81% in Phase 2 vs. 43% in Phase 3). When asked about any recommendations for improvements to the materials, students provided useful suggestions. Figure 6 summarizes the most common themes.

Figure 6: Phase 3—Overall user experience

Phase 3 outcome: Modifications to learning materials

There were three main outcomes from the questionnaire results: we created an additional tutorial on list fields, expanded the definitions page, and did further refinement of the blog architecture to make the tutorials easier to find.

Discussion

There were multiple lessons learned from the experience of selecting software for the information retrieval course, evaluating it, creating assignments and support materials, and conducting user research at each critical stage. The primary lesson was that the project of selecting new software for the course, prompted initially by the previous software's issues of security and low user-friendliness, led into a significant exploration of the experiential learning taking place for students. We implemented rigorous user research in three phases, each of which led to improvements in both the assignments and the support materials. This also supported a smooth roll-out in the inaugural semester to 435 students in 15 sections of the course.

Our exploration of different software products was also a lesson in the more advanced features of database implementation. This was somewhat surprising, given that we were selecting software for an introductory course, one required of all MLIS students. There were also licensing considerations; however, our technical support team handled vetting the software for meeting campus security requirements. Caspio offers various levels of access at different price points, including a free version available to the general public and a free version for academic institutions. We initially pursued the free academic version, which requires an annual approval process at the university level, thinking that other faculty and departments would find the software useful for research purposes. However, there was concern about potential disruption to the course should the annual re-application be denied or terms modified. The primary limitation of the free version was a low cap on the number of active datapages (search pages and submission forms); however, we determined this would be manageable for the current scope. Indeed, we found that over the span of the semester, only a few student project groups reached this limit when making duplicate datapages for testing purposes, and the situation was easily resolved by deactivating datapages no longer needed. Another limitation of the free version is that users do not have access to Caspio technical support via phone, email, or chat, although there is a public community forum in which they are free to engage. This has also turned out to be a non-issue, as evidenced in having no technical support requests in the most recent semester.

The cost and licensing issues we encountered when evaluating different software were not insignificant. We had established that low or no cost for students was critical to our decision-making, in addition to all the features and security requirements, so we considered only the free academic or trial versions of software. Limitations on how many database tables and datapages could be built by an individual user account varied across products. Some products also had time restrictions in place, for example, a two-week limit on free accounts. Fortunately, Caspio's restrictions on free accounts were within our course needs and learning objectives. While our information technology lead staff person was in communication with the Caspio account manager regarding security requirements, we continued to evaluate the product's search and build features. One fortunate outcome was that we ultimately purchased a paid account for use by faculty as we came to realize the benefits of Caspio for research purposes; those applications extend beyond the scope of this article.

In brief, Caspio is a product with a variety of industry applications and thus has many elements extending beyond the introductory concepts and applications covered in the course. This was another factor in our decisions about the support materials we created from scratch. After thoroughly reviewing the tutorials and help sections on the Caspio website, we determined that most of them covered functions at a level of complexity outside the course scope. We decided that creating our own materials would facilitate student access to highly relevant help resources and wrap them in the context of the course's design concepts. This also meant students would not need to sift through Caspio's materials. We still included links to Caspio's resources on the blog but were highly selective in this regard; the Caspio resources we recommended were listed on the main tutorial page. After Phase 2, the pilot course, student questionnaire results revealed six additional topics that needed to be covered by new tutorials (for example, how to construct cascading list fields), requiring that we create two new screencasts and four additional explanations.
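
One of those additional topics, cascading list fields, can be illustrated independently of any particular product: the options offered by a child list depend on the value chosen in a parent list. The sketch below is a generic, hypothetical illustration of that parent-child dependency, not a description of how Caspio implements the feature; the category names are invented.

```python
# Generic, hypothetical illustration of a cascading (parent-child) list field;
# not a description of Caspio's implementation. Category names are invented.

cascading_options = {
    "chocolate bars": ["dark", "milk", "white"],
    "craft beers": ["IPA", "stout", "lager"],
}

def child_options(parent_value):
    """Return the controlled vocabulary offered by the child list,
    narrowed by the value chosen in the parent list."""
    return cascading_options.get(parent_value, [])

print(child_options("craft beers"))  # ['IPA', 'stout', 'lager']
print(child_options("face masks"))   # [] -- no child vocabulary defined for this parent
```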

During the summer pilot course (Phase 2) and the inaugural semester in fall 2022 (Phase 3), the project assistant provided software support for all course sections, answering student and faculty questions related to the use of Caspio for the two assignments. Conceptual questions about database design were handled by faculty. However, we expected and indeed found that conceptual and technical questions were often intertwined. For example, a question about using controlled vocabulary might appear to be a simple technical question ("How do I set up a list field for controlled vocabulary?"), whereas the faculty member might view the question as an opportunity to discuss controlled vocabulary as a design choice ("When should a field have a controlled vocabulary? What factors and alternatives should be considered?"). This was not a significant issue, however, and would be alleviated over time as instructors became more fluent in the software. In fact, in the very next semester—spring 2023, with 441 students taking the course—the statistics on support requests showed that no students required further help from the designated IT staff or the course coordinator. This outcome is especially striking when compared to previous semesters when different software was in place: the support request count was 78 in spring 2022 and 109 in fall 2021. It should also be noted that, in spring 2023, instructors had had only one semester of experience with the new materials and software. This was perhaps the most compelling evidence that the new software and course materials were highly successful. Key to this success was the extensive and iterative user research. Based on this, we recommend to educators involved in similar projects that user research in iterative phases be conducted with students. With each phase, we refined the assignments, the tutorials, and the blog's help options.

We also knew we needed additional perspectives on the content and format of materials to ensure they would be useful and accessible to the wide range of students in the MLIS program. The design of the instructional materials was grounded in threshold concepts, as discussed above, and informed by the teaching experience of the course coordinator and by the professional and student experiences of the project assistant. Additionally, while our alpha testing validated some of our assumptions and design choices, it also revealed areas for refinement and improvement. Particularly in an online educational environment, students access material in a variety of formats and device configurations. While some students prefer digital content, others choose to print everything or to utilize a combination of printed and digital materials. Further, digital formats vary from phone and tablet to laptop and desktop with multiple monitors. Although we had anticipated this based on experience, it was critical to observe as students interacted with the materials in their preferred ways. As a result, two more recommendations are to use a mix of methods in the user research and to include in the early/alpha phase both think-aloud and semi-structured interviewing protocols. We found that this mixed approach elicited the richest input from students on the initial drafts of the materials. In the two later phases, the questionnaire collected highly useful data from a large number of students (n = 493), both about the materials and about the students, their incoming knowledge and anxiety, and their confidence upon course completion.

The student questionnaire was also a valuable source of feedback about the findability and navigability of the blog, and that feedback led to restructuring its tutorials and definitions sections. We originally designed the blog so that the tutorials were organized by relevant assignment, with a general "more tutorials" page accessed via a dropdown menu. However, feedback indicated that students were overlooking this additional page and missing the dropdown menu. In our redesign, we kept the assignment-specific pages but created a new page, entitled "Tutorials & Tips," that contained all tutorials, as well as a definitions page and a troubleshooting guide. In order to have ongoing assessment of the course materials, the Phase 3 questionnaire will continue to be used, and we are also continuing to track support requests. Our last recommendation is to put in place a plan for ongoing evaluation.

Conclusion

The project's two aims were well met. First, through the processes of the environmental scan, feature-centered evaluation, and subsequent testing against the course learning objectives and assignments, we selected the Caspio software product. Based on questionnaire results from 493 students in the redesigned course who reported high confidence in their abilities, we are likewise confident in the software choice and in the blog tutorials to support their learning about database design. In order to have ongoing assessment of the materials, the questionnaire will continue to be used. For the second aim of redesigning the assignments for enhanced learning experiences, we knew at the outset of the project that iteration in the design of curriculum is essential, just as it is in the design of information products. Modeling this practice, we developed the assignments and tutorials in concert to address the threshold knowledge of the course learning objectives.

There were also beneficial outcomes from the project beyond its two initial aims. As is often true in academic environments, many students were content to follow the tutorials and design the databases according to assignment instructions, while others wanted to explore the concepts and skills more deeply. As one instructor observed, "I think [the redesign] allows for more creativity as students submitted databases that went above and beyond assignment instructions" (San Nicolas-Rocca, personal communication, May 31, 2023). The three phases of user research supported the refinement of the assignments and the tutorials so that they could be rolled out successfully, but also so that the spectrum of student learning and discovery aspirations could be supported. The project led to valuable findings about the student learning experiences, to improvements in the curriculum, and to recommendations for educators involved in similar courses. To summarize, we recommend the following: (1) conducting user research in iterative phases to evaluate course assignments and self-guided support materials; (2) using a mix of research methods and including both think-aloud and semi-structured interviews in the early/alpha phase; and (3) putting in place a plan for ongoing evaluation.

Limitations

During the project we encountered some challenges, and there were limitations on the scope of the study. Both are summarized here in the general order in which they occurred.

For the environmental scan, we set intentional boundaries on the courses examined due to the introductory level of our course and the context of our university. This included limiting the scan to larger iSchools, drawing from both the [3] and [30] directories, and restricting the scope to required, introductory courses on information retrieval and organization. The limited availability of course syllabi led us to correspond directly with course instructors to gather more information. In addition, we were aware that more advanced software, such as MS Access, was used in non-introductory and elective courses on database design topics, but the students we were studying and the course learning objectives were too different for this information to be useful, so it was kept outside the scan's boundaries.

In the vetting of the software, we discovered that university security requirements were more complex than anticipated and were undergoing continuing updates. The latter was not surprising, given the constant changes in online security concerns and threats. Fortunately, our technical staff was able to shepherd our software selection through the extensive approval processes.

We found that the three phases of user research were critical to refining the course materials, both the assignments and the support tutorials on the blog. In addition, we put in place an ongoing evaluation program of faculty review each term and student surveys conducted annually.

Acknowledgment

We are deeply grateful to two members of the School of Information's technology team, Abigail Laufer, the iSchool's network administrator who coordinated technical communications with Caspio, and Robert Lucore, who helped with setting up the blog platform, and for the support of Dr. Linda Main, associate director of the iSchool. For early reviews of the manuscript, we thank Drs. Michael Stephens and Robert Stoops, and we are grateful to the anonymous peer reviewers whose feedback contributed to improving the final manuscript.

References

1 American Library Association (ALA). (2009). ALA's core competences of librarianship. https://www.ala.org/educationcareers/sites/ala.org.educationcareers/files/content/careers/corecomp/corecompetences/finalcorecompstat09.pdf

2 American Library Association (ALA). (2021). Draft version of ALA's core competences of librarianship. https://www.ala.org/educationcareers/sites/ala.org.educationcareers/files/content/education/Draft%20-%20ALA%20Core%20Competences%202021%20Update.pdf

3 Association for Library and Information Science Education (ALISE). (2021). ALISE statistical report. https://ali.memberclicks.net/assets/documents/statistical_reports/2021/ALISE%20Statistical%20Report%202021%20.pdf

4 Bindal, N. (2022). Experiential learning in design education: Teaching construction and technology through active experimentation in interior and architectural design. International Journal of Design Education, 16(2), 91–102. https://doi.org/10.18848/2325-128X/CGP/v16i02/91-102

5 Branch, J. L. (2000). Investigating the information-seeking processes of adolescents: The value of using think alouds and think afters. Library & Information Science Research, 22(4), 371–392. https://doi.org/10.1016/S0740-8188(00)00051-7

6 Cercone, K. (2008). Characteristics of adult learners with implications for online learning design. AACE Journal, 16(2), 137–159.

7 Clarke, R. I., Amonkar, S., & Rosenblad, A. (2019). Design thinking and methods in library practice and graduate library education. Journal of Librarianship and Information Science, 52(3), 749–763. https://doi.org/10.1177/0961000619871989

8 Conceição, S. C. O. (2021). Setting the stage for facilitating online learning. New Directions for Adult and Continuing Education, 2021(169), 7–13. https://doi.org/10.1002/ace.20410

9 Confucius. (1998). The analects (D. Hinton, Trans.). Perseus Books.

Cousin, G. (2006). An introduction to threshold concepts. Planet, 17, 4–5.

Dobreski, B., Zhu, X., Ridenour, L., & Yang, T. (2022). Information organization and information retrieval in the LIS curriculum: An analysis of course syllabi. Journal of Education for Library and Information Science, 63(3), 335–350. https://doi.org/10.3138/jelis-2021-0057

Ericsson, K., & Simon, H. (1993). Protocol analysis: Verbal reports as data (2nd ed.). MIT Press.

Fosnacht, K., Sarraf, S., Howe, E., & Peck, L. K. (2017). How important are high response rates for college surveys? The Review of Higher Education, 40(2), 245–265. https://doi.org/10.1353/rhe.2017.0003

Gibbons, S. (2016, July 31). Design thinking 101. Nielsen Norman Group. https://www.nngroup.com/articles/design-thinking/

Graffam, B. (2007). Active learning in medical education: Strategies for beginning implementation. Medical Teacher, 29(1), 38–42. https://doi.org/10.1080/01421590601176398

iSchools. (2023). iSchools member list. https://www.ischools.org/members

Knowles, M. S. (1988). The modern practice of adult education: From pedagogy to andragogy (rev. ed.). Cambridge Book Company.

Kolb, A. Y., & Kolb, D. A. (2006). Learning styles and learning spaces: A review of the multidisciplinary application of experiential learning theory in higher education. In R. R. Sims & S. J. Sims (Eds.), Learning styles and learning (pp. 45–91). Nova Science.

Kolb, D. A. (2014). Experiential learning: Experience as the source of learning and development (2nd ed.). Pearson FT Press.

Kvale, S., & Brinkmann, S. (2009). InterViews: Learning the craft of qualitative research interviewing (2nd ed.). SAGE.

Kymes, A. (2005). Teaching online comprehension strategies using think-alouds. Journal of Adolescent & Adult Literacy, 48(6), 492–500. http://doi.org/10.1598/JAAL.48.6.4

Land, R., Cousin, G., Meyer, J. H. F., & Davies, P. (2006). Implications for course design and evaluation. In J. H. F. Meyer & R. Land (Eds.), Overcoming barriers to student understanding: Threshold concepts and troublesome knowledge (pp. 195–206). Routledge.

Land, R., Meyer, J. H. F., & Baillie, C. (2010). Threshold concepts and transformational learning. In R. Land, J. H. F. Meyer, & C. Baillie (Eds.), Threshold concepts and transformational learning (pp. ix–xlii). Sense.

Lewis, C., & Rieman, J. (1994). The thinking aloud method. In C. Lewis & J. Rieman (Eds.), Task-centered user interface design: A practical introduction (Section 5.5). http://hcibib.org/tcuid/

Merriam, S. B., & Bierema, L. L. (2013). Adult learning: Linking theory and practice. Wiley.

Merriam, S. B., Caffarella, R. S., & Baumgartner, L. M. (2007). Learning in adulthood: A comprehensive guide (3rd ed.). Wiley.

Meyer, J. H. F., & Land, R. (2003). Threshold concepts and troublesome knowledge (1): Linkages to ways of thinking and practising within the disciplines. In C. Rust (Ed.), Improving student learning: Ten years on (pp. 1–16). Oxford University Press.

Murray-Johnson, K., Munro, A., & Popoola, R. (2021). Immersive deep learning activities online. New Directions for Adult & Continuing Education, 2021(169), 35–49. https://doi.org/10.1002/ace.20412

San José State University [SJSU], School of Information. (2019). Program learning outcomes–MLIS: Statement of core competencies. https://ischool.sjsu.edu/mlis-program-learning-outcomes

San José State University [SJSU], School of Information. (2022). MLIS program performance: Graduating student exit survey data. https://ischool.sjsu.edu/mlis-program-performance#exit-survey

San José State University [SJSU], School of Information. (2023). INFO 202: Information retrieval system design—Course learning objectives. https://sjsu.campusconcourse.com/view_syllabus?course_id=5310&public_mode=1

Tucker, V. M. (2016). Learning experiences and the liminality of expertise. In R. Land, J. H. F. Meyer, & M. T. Flanagan (Eds.), Threshold concepts in practice (pp. 93–106). Sense Publishers. https://doi.org/10.1007/978-94-6300-512-8_8

Tucker, V. M. (2021). Becoming an information architect: The evolving librarian's skillset, mindset, and professional identity. Education for Information, 37(4), 485–500. https://doi.org/10.3233/EFI-211558

Tucker, V. M., & Edwards, S. L. (2021). Search evolution for ease and speed: A call to action for what's been lost. Journal of Librarianship and Information Science, 53(4), 668–685. https://doi.org/10.1177/0961000620980827

Tucker, V. M., Weedman, J., Bruce, C. S., & Edwards, S. L. (2014). Learning portals: Analyzing threshold concept theory for LIS education. Journal of Education for Library & Information Science, 55(2), 150–165.

Van Someren, M. W., Barnard, Y. F., & Sandberg, J. (1994). The think aloud method: A practical guide to modeling cognitive processes. Academic Press.

Weedman, J. (2018). Design science in the information sciences. In J. D. McDonald & M. Levine-Clark (Eds.), Encyclopedia of library and information sciences (4th ed., pp. 1242–1255). CRC Press. https://doi.org/10.1081/E-ELIS4

By Virginia M. Tucker and Christina Perucci

Virginia M. Tucker is associate professor at the School of Information, San José State University. Her research and teaching focus on information architecture and user-centered design. She previously worked as a product architect, user training manager, and physics librarian. She has a PhD in information systems, Queensland University of Technology; MLS, University of California-Berkeley; and BA, Stanford University.

Christina Perucci is adjunct faculty at the School of Information, San José State University. She received a bachelor's degree in sociology from Tufts University, an MA in applied education psychology from Columbia University, and an MLIS from San José State University. Her interests include information organization, information access equity, and information literacy.