Understanding the information literacy competencies of UK Higher Education students
There has been some interesting debate regarding the assessment of students’ information literacy skills. Key questions have arisen, such as: what standards and criteria should we use to assess students, what are we actually trying to measure, what type of test is the most appropriate, what do the results mean, how do we measure improvements and what are the effects of intervention? Our research project, funded by the LearnHigher Centre for Excellence in Teaching & Learning, addressed these questions via the use of psychometric tests. An online information literacy audit, the ILT (Wise et al., 2005), was used to assess the information literacy skills of a cohort of undergraduate students in the Department of Information & Communications at Manchester Metropolitan University from 2007 to 2009. Based on the ACRL standards, the ILT measures four of the five information literacy competencies. Our specific research aims were to ascertain whether the testing methods were appropriate to UK students, identify areas for information literacy improvement raised in the test scores, identify practitioner intervention strategies and find out whether these could make a difference to students’ information literacy levels. Test results (presented and discussed in detail in the chapter) indicated that students struggled with (predictable) areas of information literacy, and that (a) identification of, and intervention in, these areas is useful to students and to those who support information literacy; (b) intervention needs to be ongoing, not only in year 1; (c) even in their final year students continue to struggle with the same particular activities; and (d) a variety of approaches to support may be needed to help students develop their skills. Following completion of the testing, discussions have taken place with the CILIP CSG for information literacy regarding the creation of a UK information literacy question bank.
There has been some interesting debate recently regarding the assessment of students’ information literacy skills. A number of key questions have arisen such as: what standards and criteria should we use to assess students, what are we actually trying to measure, what type of test is the most appropriate, what do the results mean, how do we measure improvements and what are the effects of intervention? Is a ‘one test fits all’ solution practical? While a number of information literacy audits or tests exist, it was decided that this research would use one of the most widely trialled information literacy tests, the online ‘Information Literacy Test’ (ILT) from Steven Wise, Lynne Cameron and their team at the Institute for Computer Based Assessment, James Madison University, Virginia, USA (Wise et al., 2005).
This chapter details a longitudinal assessment of an undergraduate cohort at the Department of Information & Communications, Manchester Metropolitan University, undertaken in the context of the LearnHigher CETL (Information Literacy Learning Area) research activities.
The work undertaken was longitudinal in nature and was repeated with the same group of students as they progressed from year 1 to year 2 to year 3 of their undergraduate programmes. The initial testing took place during January and February 2007 and was repeated in 2008 and 2009. The test was taken by students from the Common Undergraduate Programme of the Department of Information & Communications. The test measures performance in four of the five information literacy competencies identified in the US-formulated ACRL standards (ACRL, 2000). Analysis of the test results was undertaken using SPSS, and this chapter will present findings and recommendations for future work.
The results indicated that there were indeed significant areas of weakness in students’ ability measured against four of the five ACRL competency standards and, while intervention by practitioners improved student performance, it was found that ongoing support was required to ensure continued progress. The following sections will discuss this work in greater detail.
In recent years there has been a growing recognition of the importance of information literacy. Two high profile examples of such recognition are the Executive Order establishing a California ICT Digital Literacy Leadership Council and an ICT Digital Advisory Committee, issued by California Governor Arnold Schwarzenegger (May 2009), and the Presidential Proclamation on the National Information Literacy Awareness Month, issued in October 2009 (http://www.whitehouse.gov/the_press_office/presidential-proclamation-national-information-literacy-awareness-month/).
The concept of ‘information literacy’ was first introduced in 1974 by Paul Zurkowski, president of the US Information Industry Association, in a proposal submitted to the National Commission on Libraries and Information Science (NCLIS). He recommended that a national programme be established to achieve universal information literacy within the next decade. According to Zurkowski: ‘People trained in the application of information resources to their work can be called information literates. They have learned techniques and skills for utilizing the wide range of information tools as well as primary sources in moulding information solutions to their problems’ (Behrens, 1994; Bruce, 1997). In this definition Zurkowski suggested that information resources are applied in a work situation; techniques and skills are needed for using information tools and primary sources; and information is used in problem solving (Behrens, 1994: 310).
During recent years discussions about the terms information literacy and information skills, and the nature of the concepts, have intensified in the UK. There are different approaches, demonstrated by the use of differing terms such as ‘information literacy’ and ‘information skills’, and many definitions have been suggested by organisations, institutions and authors (Virkus, 2003). Researchers on the UK’s Joint Information Systems Committee (JISC) funded ‘The Big Blue’ project, led by Manchester Metropolitan University and the University of Leeds, found that in many instances both terms are used to describe what is essentially the same concept: ‘information literacy’ and ‘information skills’ can be described as synonyms (The Big Blue, 2002). Stubbings and Brine (2003) also note that at Loughborough University the phrases ‘information literacy’ and ‘information skills’ are both used to convey the same meaning. The Glossary of Information Terms at the British Open University (OU) Library site seems to support the same approach, giving the following definition of information literacy: ‘a skill that involves being able to use information successfully, including finding information, searching using various tools (e.g., Internet, databases) and being able to critically evaluate the results’ (OU, 2003; Virkus, 2003).
Hepworth (2000a, 2000b) highlights two main approaches to information literacy that are evident: (1) attempts to identify discrete skills and attitudes that can be learnt and measured, for example Doyle (1992), the Information Literacy Competency Standards for Higher Education (ACRL, 2000) and the SCONUL approach (SCONUL, 1999); (2) emphasis on the information literate mindset associated with how an individual experiences and makes sense of his/her world, for example the work of Bruce (1997) illustrates this approach and is described as the behavioural, constructivist and relational approaches to information literacy (Virkus, 2003).
In the UK two critical definitions have been presented, by SCONUL and by CILIP. The broadly-based definition of information skills in higher education of the Society of College, National and University Libraries (SCONUL) Information Skills Task Force (now the SCONUL Advisory Committee on information literacy (Alvestrand, 2003)) reflects the twin dimensions of the ‘competent information user’ at the base level and the ‘information-literate person’. For the latter level of information skills, the term ‘information literacy’ is used. Therefore, both information skills and information technology (IT) skills are seen as essential parts of the wider concept of information literacy. For the development of the information-literate person SCONUL proposes seven sets of skills. The outline model of information skills generated in the briefing paper has become known as the Seven Pillars Model. The pillars show an iterative process whereby information users progress through competency to expertise by practising the skills (SCONUL, 1999; Bainton, 2001).
In 2003 the Information Literacy Executive met to agree a definition of information literacy for use by CILIP (the Chartered Institute of Library and Information Professionals) members. The definition was approved by the CILIP Council in December 2004 as CILIP’s definition of information literacy: ‘Information literacy is knowing when and why you need information, where to find it, and how to evaluate, use and communicate it in an ethical manner.’
The CILIP Community Services Group (CSG) sub-group for information literacy acts as advocates and facilitators for the development of information literacy awareness and education within the UK and beyond through committee work, the LILAC conference (http://lilacconference.com) and their website (http://www.informationliteracy.org.uk/).
While there have been numerous national initiatives to address information literacy, it remains a key challenge to measure or assess an individual’s level of information literacy, and any progress that may be made in improving an individual’s information literacy skills. The research presented here used the US ILT and identified that specific areas of concern could be highlighted, and thus targeted for intervention, and that levels of information literacy varied across the three-year longitudinal study. At the same time, it is recognised that the American bias in the style and focus of the questions might influence the understanding, and subsequent performance, of UK students taking the test.
The ILT is an online psychometric test comprising sixty-five multiple-choice questions, of which sixty are scored, assessing a range of information literacy competencies. The test was designed and validated by psychologist Steven Wise, with question content created by Lynne Cameron, each heading a team of associated contributors (Wise and Yang, 2003; Wise et al., 2005).
A successful consortium bid to create a ‘One Stop Shop’ for resources for learner development in HE resulted in the creation of the LearnHigher Centre for Excellence in Teaching and Learning (CETL) in January 2005 (www.learnhigher.ac.uk). LearnHigher received funding as one of 74 Centres for Excellence in Teaching and Learning created by HEFCE as part of their learning and teaching enhancement strategy (http://www.hefce.ac.uk/learning/tinits/cetl/).
Led by Liverpool Hope University, LearnHigher was the largest collaborative CETL with partners from 16 institutions. Each partner committed to improving student learning by providing resources to support learning development and, through practice-led research, to evaluate the effective use of those resources. LearnHigher aimed to create a network of expertise seeking to enhance professional practice and student learning, and to build capacity both within the network and across the wider sector.
LearnHigher members sought to identify, map and label the key issues and topics of learner development. Agreement on what the ‘learning areas’ should be, how they should be supported and what outcomes were expected was considered critical from the outset of the project. The learning areas below were chosen in the light of relevance to the project aims, the expertise of the individuals and institutions taking part, and consortium discussion. Most of the HEIs involved in the LearnHigher CETL were allocated one learning area; a small number had two. Manchester Metropolitan University was allocated the learning area for information literacy (Glass, 2006, 2007a, 2007b). There were initially 19 learning areas in all:
The information literacy learning area at MMU, led by Bob Glass, was administered via a team made up of academics, library practitioners, learning support advisers and research associates. Through this team a large number of activities and resources were created and uploaded to the Information Literacy Learning Area of the LearnHigher Website (http://learnhigher.ac.uk/Staff/Information-literacy.html).
A requirement of the project brief was that each learning area should contribute evidence of their research activities and disseminate the outcomes. As part of the research contribution to the Information Literacy Research Area it was decided to run an information literacy audit for all year 1 students in the Department of Information & Communications, MMU, in late 2006. This project was also supported by a small grant from the Learning & Teaching Group in the Humanities, Law and Social Sciences faculty, plus the department of Information & Communications. Additionally it was decided to include around 20 students from another department in the faculty in order to generate comparative data.
The Information Literacy Test that we used was the ACRL-based (ACRL, 2000) ‘Online Information Literacy Test’ (ILT) from James Madison University (JMU). There were a number of reasons for this choice, including the nature of the test, the similarities in the standards and the availability of a ‘product’ that was ready to use. Most of the testing took place between December 2006 and February 2007. Seventy-five students in the Common Undergraduate Programme of the Department of Information & Communications and twenty from the Economics department in the same faculty were tested. The psychometric test (which is charged for on a per-student basis) comprises 65 multiple-choice questions. Sixty of the questions are static; five are used as ‘practice’ questions for development purposes and are varied as required by the test developers. Students take between 60 and 75 minutes to complete the test. The test measures performance in four of the five information literacy competencies identified in the US-formulated ACRL standards (ACRL, 2000); written abilities cannot really be addressed by this kind of test. Students receive their score immediately at the end of the test, and tutors are provided with an extensive range of statistics relating to student performance, question scores and overall test results. The data file is provided in Access, Excel, SPSS or other formats. We undertook our analysis using SPSS, as this was the most convenient format to use at MMU. Previous results have been presented by Glass and Griffiths (2008, 2009, 2010).
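The per-standard analysis described above can be illustrated with a short sketch. This is not the SPSS procedure used in the study; it is a hypothetical Python equivalent, and the record layout and column names (ACRL1, ACRL2, ACRL3, ACRL5) are invented for illustration.

```python
from statistics import mean
from collections import defaultdict

# Hypothetical layout: one record per student, with a percentage-correct
# figure for each of the four ACRL standards measured by the ILT.
STANDARDS = ("ACRL1", "ACRL2", "ACRL3", "ACRL5")

def summarise_by_standard(rows):
    """Return the mean percentage correct per ACRL standard."""
    totals = defaultdict(list)
    for row in rows:
        for standard in STANDARDS:
            totals[standard].append(float(row[standard]))
    return {s: round(mean(v), 1) for s, v in totals.items()}

# Two invented student records:
sample = [
    {"ACRL1": "70", "ACRL2": "55", "ACRL3": "62", "ACRL5": "68"},
    {"ACRL1": "64", "ACRL2": "49", "ACRL3": "64", "ACRL5": "72"},
]
print(summarise_by_standard(sample))
```

The same aggregation, repeated per cohort year, yields the kind of per-standard comparison reported in the figures that follow.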
In year 1, students undertook the test within computer teaching labs within the Department of Information & Communications during one-hour ‘seminar’ sessions (allowing for ‘over run’ time if necessary). The sessions were formal, supervised and run as part of a year 1 compulsory unit (‘Information Literacies for the Digital Age’), facilitated by Bob Glass and Chris Dawson, both departmental tutors.
The ACRL standards assess information literacy competencies using five measures. However, one of the standards (Standard 4: the student is able to use information effectively to accomplish a specific purpose, e.g. writing an essay) cannot be measured using a multiple-choice item format and was excluded from these assessments. Therefore the four standards assessed via the ILT are:
Two performance levels have been defined by the ILT creators: Proficient and Advanced. A score of 65–89 per cent (39–53 out of 60) is required for a student to be assessed as Proficient, and a score of 90 per cent or more (54+ out of 60) for a student to be assessed as Advanced. A Proficient student will be able to:
The mean scores of students across the three years of study are shown below, demonstrating some improvement in the information literacy of the students over the period of research (see Fig. 10.1).
The mean score of students in each year of study fell just short of the score required to be Proficient, with a peak score of 64.29 noted in year 2. From field observations it was noted that the year 2 sample comprised the most focused and committed of the students, biasing the overall score towards the more capable participants. By year 3 a broader spread of participants was again recruited. However, overall improvement was observed from year 1 to year 3.
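The two performance levels translate into a simple banding of raw scores. The following is a minimal sketch of that banding, written for this chapter as an illustration of the thresholds stated above; it is not part of the ILT software itself.

```python
def classify_ilt_score(correct, scored_questions=60):
    """Band a raw ILT score using the creators' performance levels:
    Proficient = 65-89 per cent (39-53 of 60 scored questions),
    Advanced = 90 per cent or more (54+ of 60)."""
    percentage = 100 * correct / scored_questions
    if percentage >= 90:
        return "Advanced"
    if percentage >= 65:
        return "Proficient"
    return "Below Proficient"

print(classify_ilt_score(54))  # Advanced (54/60 = 90 per cent)
print(classify_ilt_score(39))  # Proficient (39/60 = 65 per cent)
```

A cohort mean of 64.29 per cent, as observed in year 2, sits just below the Proficient band, which is the pattern the figures in this section report.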
Results across these four standards show that the majority of students (see Fig. 10.2):
were able to answer correctly questions concerning evaluating information and its sources critically and incorporating selected information into their knowledge base and value system: ACRL3 (61–65 per cent of students scoring correctly);
were able to answer correctly questions concerning understanding many of the ethical, legal and socio-economic issues surrounding information and information technology: ACRL5 (65–73 per cent of students scoring correctly).
From these results it would seem that the majority of students have little or no difficulty identifying the information they need, are able to evaluate the information and its sources and incorporate the information into their knowledge, and understand some of the legal and ethical issues surrounding the use of information. However, some areas are causing difficulties, and the results of these tests show that students in year 3 scored slightly lower across each standard; these results are presented in detail below.
Two thirds of students encountered little difficulty in defining and articulating the nature and extent of the information they needed, and an improvement was apparent from years 1 to 3. However, there was a slight drop between years 2 and 3. Detailed analysis identified that year 3 students struggled to identify what primary sources are and where they are found (44 per cent incorrect).
Some students also appear to struggle with how to go about accessing that information (see Fig. 10.4). These skills are core to our profession, critical in information literacy and an area where the library can, and does, provide excellent training; this training can be further directed and targeted by understanding the results of this test. In year 3 the following areas caused particular difficulties:
Results for ACRL standard 5 – understands many of the ethical, legal and socio-economic issues surrounding information and information technology
Certainly, these results provide detailed identification of areas for practitioners to target to improve information literacy – results which we hope the community will find useful. However, in conducting this research other findings have emerged, such that it is posited that:
It is also felt that using a US-based test designed for a US context may cause bias and/or confusion among UK students. If a UK test were developed, it might prove an effective diagnostic tool for students, tutors and library practitioners. Discussions with the LearnHigher Information Literacy Learning Area and the CILIP CSG for Information Literacy have led to initial work to create a UK Information Literacy Question Bank. It is envisaged that this Question Bank may be used by different people in different ways, for example in online psychometric tests, printed quizzes or web-based interactive tutorials. This work is now in progress, and it is hoped that it will provide the UK with a method to ensure appropriate assessment of information literacy and to enable meaningful intervention for students.
The Big Blue: Final Report. 2002. http://www.library.mmu.ac.uk/bigblue/pdf/finalreportful.pdf
Glass, B., Griffiths, J.R. Understanding the Information Literacy Levels of Students: The Results of a Three Year Online Information Literacy Audit at Manchester Metropolitan University. LILAC 2010, 29–31 March, Limerick Strand Hotel, Limerick, Republic of Ireland, 2010.
Glass, N.R. LearnHigher CETL: Information Literacy. The LearnHigher Suite at MMU. 2006. LearnHigher: www.learnhigher.mmu.ac.uk/learnhigher-suite/
Glass, N.R. LearnHigher CETL: Information Literacy. The Project. 2007a. LearnHigher: www.learnhigher.mmu.ac.uk/project/
Glass, N.R. LearnHigher CETL: Information Literacy. Resources. 2007b. LearnHigher: www.learnhigher.mmu.ac.uk/resources/
Hepworth, M. Developing information literacy programs in Singapore. In: Bruce, C.S., Candy, P.C., eds. Information Literacy around the World: Advances in Programs and Research. Wagga Wagga, NSW: Charles Sturt University, 2000:51–65.
Open University Library, Glossary of information terms. Open University, Milton Keynes, 2003. http://library.open.ac.uk/help/helpsheets/intglossary.html
Stubbings, R., Brine, A. Reviewing electronic information literacy training packages. Innovations in Teaching and Learning in Information and Computer Sciences (ITALICS). 2003;2(1). http://www.ics.ltsn.ac.uk/pub/italics/issue1/stubbings/010.html
Virkus, S. Information literacy in Europe: a literature review. Information Research. 2003;8(4), paper no. 159. http://informationr.net/ir/8-4/paper159.html (accessed 2 October 2005).
Wise, S.L., Cameron, L., Yang, S., Davis, S. Information Literacy Test: Test Development and Administration Manual. Harrisonburg: Institute for Computer-Based Assessment; Center for Assessment & Research Studies, James Madison University; 2005.