The term information literacy (IL) appeared in studies published in the 1990s within the world of libraries, but it did not come into wide use until ten years later. Cuevas (2007: 25) defines the concept as ‘education in a set of competencies and skills related to the access, use and evaluation of information’. Information literacy is closely related to digital literacy and is a key element in the networked society, the knowledge society and the society of lifelong learning. In a document entitled Towards Information Literacy Indicators, published by UNESCO, Catts and Lau (2008) suggest another definition of the term. They claim that information literacy ‘is the capacity of people to recognise their information needs, locate and evaluate the quality of information, store and retrieve information, make effective and ethical use of information, and apply information to create and communicate knowledge’.
Marti et al. (2008) list the different objectives pursued by information literacy:
To develop the skills to construct and implement an institutional programme for the constant updating of knowledge on how to use information and communication technologies (ICTs) and on methodologies for accessing and managing information.
To reinforce individual and institutional ICT competences, as well as methodologies for accessing information and managing knowledge and aiding organisational and life change within the Information Society.
To acquire new working habits in which the capacity to analyse complex situations predominates; to identify, analyse and solve problems; to plan, organise and critically evaluate extraordinary work situations.
IL has taken shape as an evolution of library patron training, a service that university libraries have offered since the 1970s with the goal of showing and teaching users how to use all the information services and resources they make available. Today, training at libraries goes a step further: it is necessary not only to educate patrons in the use of sources and other information resources but also to give them the competences and skills needed to locate, choose and evaluate any kind of information they might need over the course of their lifetimes. The document published by the ACRL (Association of College & Research Libraries) in 2000, entitled Information Literacy Competency Standards for Higher Education, proposes a series of competences, divided into five standards and 22 performance indicators, which outline the process by which faculty and librarians identify a student as information literate. The standards focus on the needs of students in higher education at all levels and list a range of outcomes for assessing student progress towards information literacy.
In turn, Bainton (2001) presents an information skills model which attempts to illustrate the relationships between the ‘competent information user’ at the base level and the much more advanced idea of information literacy. The ‘pillars’ show an iterative process whereby information users progress through competency to expertise by practising the skills (see Figure 5.1).
Many international bodies have taken an interest in information literacy in university libraries. Proof of this is documents like the Prague Declaration: Towards an Information Literate Society1 (2003) and the Alexandria Proclamation on Information Literacy and Lifelong Learning2 (2005), which stress the importance of libraries and information services as providers of the tools needed for citizens’ information literacy. Likewise, the International Federation of Library Associations and Institutions (IFLA) has had an Information Literacy Section since 2002 that focuses on all aspects of information literacy, including user education, learning styles, the use of computers and media in teaching and learning, networked resources, partnerships with teaching faculty in the development of instructional programmes, distance education and the training of librarians in teaching information and technical skills. It has issued two documents to guide libraries and information centres: Guidelines for Information Literacy Assessment3 dating from 2004, and Guidelines on Information Literacy for Lifelong Learning4 dating from 2006.
The Information Literacy Coordinating Committee of the Association of College & Research Libraries (ACRL) has developed a variety of information literacy tools. One example is its website,5 which is framed as a gateway to resources on information literacy, helping users understand and apply the Information Literacy Competency Standards for Higher Education to enhance teaching, learning and research in the higher education community. It also offers a series of resources and ideas that facilitate library professionals’ jobs, including a glossary of related terms, a standards toolkit, information on assessment issues and bibliographies.
Likewise, the main goal of the Society of College, National and University Libraries (SCONUL) and its Working Group on Information Literacy is to ensure that the role of information literacy in learning and teaching, research and organisational enhancement is communicated effectively and understood by the wider educational professional groups in higher education. In 1999, the Working Group drew up a baseline document entitled The Seven Pillars of Information Literacy,6 which aimed to define what is meant by ‘information skills’; to articulate why information skills are important to higher education students; to assess the size and scope of current activity in higher education in the UK and Ireland; and to identify principles of good practice in this area.
Projects worth noting in Spain include ALFIN-EEES,7 a pilot initiative coordinated by Maria Pinto, a professor at the University of Granada. Its mission is to propose contents for the main generic competences related to information literacy, valid for any university student who needs to look for, manage, organise and evaluate information gathered from a wide variety of sources. Another worthwhile project is the Alfin Red forum,8 a virtual community for studying, researching, promoting and implementing information literacy services.
Strategic planning and goal-based management are practices that have been in use at university libraries for many years. It is important for each institution to be aware of its priorities and to plan which jobs and services it must perform and what resources it has to perform them. Usually all of these predictions come together in a document called a strategic plan, which serves professionals as a procedure book for organising jobs within the service. Strategic plans at university libraries have a similar structure and tend to be organised into three parts:
‘Incorporate fundamental research skills and concepts into the first-year experience and general education and continue to build on these skills through the major to ensure that all graduates are competent in their ability to use information resources in their chosen career and are prepared for lifelong learning’ (Strategic Plan 2007–2012 of the Meriam Library, California State University, Chico).
‘Improve the quality of student learning through improved teaching practices. Contribute to improved information literacy teaching practice within flexible learning environments to enhance graduate capabilities’ (Library Strategic Plan 2009–2012 of La Trobe University, Australia).
Other libraries have even drawn up specific information literacy plans, such as the 2005 Information Literacy Strategic Plan of Louisiana State University, which was revised in 2009. It cites the following objectives:
If we want information literacy to play a prominent role in libraries, it must be included as a specific strand within the library’s strategic plan. It is also wise to draw up a specific programme that includes all the aspects that must be reviewed before beginning training, as listed below:
1. Mission, goals and objectives, which include a definition of information literacy, reflect sound pedagogical practice, articulate its integration across the curriculum, accommodate student growth in skills and understanding throughout the college years, etc. (ACRL, 2003).
2. Drawing up a calendar that provides a detailed description of the type, levels and models of training that are going to be taught at the library and the tools that will be used. Information literacy programmes at university libraries are usually designed with both a face-to-face and a virtual component in mind. As we shall explain in other chapters, many universities are using e-learning platforms to teach their distance courses. Libraries are now taking advantage of these systems to launch virtual information literacy programmes, usually with the support of face-to-face teaching.
Training at university libraries is targeted at all patrons who use the service, including: students at all levels (bachelor’s, master’s and doctoral programmes, exchange students, etc.), teaching and research staff (full professors, associate professors, adjunct professors, visiting professors, grant awardees, associates, etc.) and administrative and services staff (administrative assistants, scholars, economists and finance experts, IT workers, student support staff, university extension, etc.).
The tools libraries use to provide face-to-face training have not changed much over time: courses, guided tours, practice sessions, handbooks and presentations predominate in the vast majority of programmes. However, virtual training requires another kind of resource, one that is compatible with e-learning platforms and makes learning easier. The 2.0 applications which we discuss in one of the sections of this chapter play an important role in distance learning; they include blogs, wikis, social networks, shared files, etc.
3. Seeking alliances and cooperation with other departments to facilitate the institutional recognition of this training is another factor to bear in mind within the information literacy programme. It is crucial to seek alliances at all echelons within the university: the academic organisation (chancellor, vice-chancellors, faculties and schools, departments and areas), the administrative organisation (management, human resources, etc.) and, last but not least, students, through the student delegates and representatives. Each group can support the inclusion of information literacy in curricula and training programmes for the entire university community, or the improvement of infrastructures for training, both face-to-face and virtual. Specifically, students have to be convinced that lifelong learning and training in the access to, evaluation and use of information are necessary for their studies, and that what they learn at university will later serve them in an increasingly competitive job market. No matter how much the library tries to raise their awareness, however, the real power of persuasion lies with the teachers in their classrooms.
4. In turn, librarians must show a particularly positive attitude towards the new challenges facing them. Training is a library function that must be turned into a patron service, meaning that information professionals must take on the role of teacher or trainer. Moreover, a sound understanding of library resources adds value to the staff, so it is worth ensuring that everyone working at the library gets involved in information literacy, each at his or her own level.
5. Assessment and evaluation of information literacy programmes, using performance indicators, is necessary in order to ascertain the positive or negative development of the process and detect errors that can be solved on later occasions. According to Lau (2006), there are different assessment methods to support students throughout the information literacy learning process. Here are the primary recommended tools (Figure 5.2):
Checklists. These are lists to guide students in the accomplishment of their assignments and they include the different stages, levels or items necessary to complete the assignment. Checklists should be visual task reminders to improve student growth and should be provided at the beginning of the assignment so that they can be used during the whole learning project or task for self-feedback.
Rubrics are precisely structured assessment tools that guide students towards a successful performance. They normally include a graded list of the attributes students ought to demonstrate in their learning tasks. A rubric can be divided according to the steps of the process, with a clear indication of each element to be considered in order to reach the desired goal.
Conferencing is a technique that is based on a discussion with the learner, among learners or among the whole class to orally reflect on the information literacy processes. It can be done at the different stages of the information tasks, as well as at the end of the process. It uses questions posed by the facilitator inquiring about the process of learning.
Portfolios consist of accumulating student work over time and integrating it into a final package of IL process products. Portfolios are useful assessment techniques because they give students the chance to see their learning products become integrated into a final product. They are an excellent way to measure the efficiency of attaining the learning goals and evaluate the effectiveness of learning strategies and the clarity of knowledge presentation.
Reports are useful essay exercises as long as they are not cut-and-paste exercises or a repetition of the information in printed or electronic sources with little synthesis or no evaluation of the retrieved information.
Traditional tests or lists of questions with open-ended or structured answer options are also useful. Tests can be used when time is limited or when the assessment is specifically focused on a certain aspect of learning.
In order for a university library to be successful in its mission to make its patrons information literate, it must properly plan its range of courses and other activities over the academic year. The alliances that have been reached with the different stakeholders within the institution will be utilised to ensure that the programme is fully included within the curricula. This is so that the library can provide ongoing compulsory training in all degree programmes as yet another class, and that the information literacy courses and activities are duly recognised and reach as many users as possible.9
It is also important to plan the information literacy programme by areas of knowledge. Students, faculty and administrative staff who are studying and working in similar fields usually have similar needs, so offering training targeted at general fields of knowledge is more effective.
Finally, we recommend that you organise the training for students by educational levels or years. Students who are in their first year of university often lack specific information knowledge, while those at higher levels usually have more advanced skills.
To empower patrons to use the tools needed to ensure that they learn how to deal with, locate, select and disseminate information while taking advantage of its usefulness for other aspects of their life, both professional and personal.
‘Reserve a librarian’10
The launch of arXiv11 by Paul Ginsparg in 1991 opened a new stage in the academic and research world. arXiv is the first e-prints12 repository, covering high-energy physics, mathematics and computer science; it was created at the Los Alamos laboratory in the United States and can be accessed freely through the Internet. It was followed by other open digital archives, such as RePEc13 (Research Papers in Economics) and Cogprints14 (Cognitive Sciences Eprints Archive), both created in 1997 with the same purpose as the first: storing and managing digital content generated by research in particular subject areas so that it can be consulted openly through the Internet. Twenty years later we can speak of the full establishment of these services, so much so that the top positions in the Ranking Web of World Repositories (July 2009)15 are occupied by subject repositories.
The EPrints16 platform, developed in 2000 by a team of researchers at the School of Electronics and Computer Science of the University of Southampton, was used to implement the first institutional repository which, in contrast to subject repositories, gathers the research output of a specific organisation. Over time, new products have been developed for creating open digital archives and, since the appearance of the first subject-based deposits, this type of service has proliferated massively in universities and other institutions devoted to scholarly activity. Today they have become one of the most innovative services in academic libraries and are almost compulsory in every entity devoted to research.
a suite of services that an institution offers to its community for the management and dissemination of the digital contents generated by the members of that community. At its most basic level, it is an organisational commitment to the control of those digital materials, including their preservation, arrangement, access and distribution.
fundamentally the vehicle for reaching open access; it is more useful if it is interoperable, if it provides for the long-term preservation of its contents and if it is accompanied by an effective institutional policy. It is also a living entity and the academic image of the organisation.17
It is worth highlighting two of the ideas in this last definition by Suber. On the one hand, the substantial role the author grants to institutional repositories ties in with the Budapest Declaration of 2002, one of the foundations of the worldwide open access movement, which establishes two routes to this type of access: the golden road, publication in open access journals, and the green road, the deposit of digital materials in institutional or subject repositories (self-archiving). On the other hand, the requirement that these services be backed by policies duly defined by the governing team of the organisation that creates them should be a priority condition before their development. However, in most institutions the policies supporting open access have emerged in parallel with the implementation of repositories rather than preceding them.
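In practice, the interoperability Suber mentions is typically achieved through the OAI-PMH protocol, under which a repository exposes simple Dublin Core metadata records that any harvester can collect. As a rough illustrative sketch (the sample record and its field values are invented, not taken from any real repository), parsing one harvested record might look like this:

```python
import xml.etree.ElementTree as ET

# A minimal, hand-written fragment of the kind of XML an OAI-PMH repository
# returns for a single record in the unqualified Dublin Core format.
SAMPLE = """
<record xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/"
        xmlns:dc="http://purl.org/dc/elements/1.1/">
  <oai_dc:dc>
    <dc:title>An example e-print</dc:title>
    <dc:creator>Doe, Jane</dc:creator>
    <dc:date>2009-07-01</dc:date>
  </oai_dc:dc>
</record>
"""

# Namespace prefixes used when querying the parsed tree.
NS = {"oai_dc": "http://www.openarchives.org/OAI/2.0/oai_dc/",
      "dc": "http://purl.org/dc/elements/1.1/"}

def record_summary(xml_text: str) -> dict:
    """Extract title, creators and date from one Dublin Core record."""
    root = ET.fromstring(xml_text)
    dc = root.find("oai_dc:dc", NS)
    return {
        "title": dc.findtext("dc:title", namespaces=NS),
        "creators": [c.text for c in dc.findall("dc:creator", NS)],
        "date": dc.findtext("dc:date", namespaces=NS),
    }
```

Because every OAI-PMH-compliant repository emits the same record structure, a harvester written once in this style can aggregate metadata from institutional and subject repositories alike.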
The boost that universities have given to repositories has been uneven. Some have established a mandate to self-archive in the institutional repository all the research generated and subsidised by the institution itself. In theory this should be the most effective measure: all the scientific information generated at the university would quickly become accessible to everybody through the Internet, guaranteeing the growth of repository contents and, consequently, their progress. But not all authors follow this trend or think the same way. Whereas Harnad states the need for mandates in order to maximise growth and guarantee the success of institutional repositories, McGovern affirms that other criteria favour their consolidation, including the use made of their contents and the financial and technical support they obtain, also pointing out that mandates may give rise to more problems than benefits (Harnad and McGovern, 2009).
ROARMAP18 (Registry of Open Access Repository Material Archiving Policies) is a worldwide registry of self-archiving policies (see Figure 5.3). At present,19 it includes 139 mandates and 15 proposals, most of them from universities and agencies that finance research. Among the most relevant recent mandates is that of the US National Institutes of Health20 (NIH): with the approval by the US Senate of the FY2008 ‘Labor, Health and Human Services and Education and Related Agencies Appropriations’ Act, researchers who receive NIH grants have a duty to deposit copies of their texts in PubMed Central21 within a year of publication in a peer-reviewed journal. These new regulations break with the NIH’s previous policy of voluntary deposit, approved in 2005 and in force until 2008.
At Harvard University,22 the Faculty of Arts and Sciences and the School of Law initiated this type of mandate in 2008: the faculty members, who approved the mandates by a large majority, are obliged to deposit all their research in the university’s institutional repository. They were followed by the School of Education and the John F. Kennedy School, which have also adopted self-archiving mandates, and finally by the School of Medicine, which has already registered its proposal.
MIT (Massachusetts Institute of Technology)23 launched its mandate in March 2009 with the unanimous approval of the teaching staff, who commit to sending the final version of their works, in electronic format and free of charge, to a representative of the provost’s office, which is then in charge of placing this digital material in the institutional repository.
In Spain, the Regional Community of Madrid, in its role as funding agency, approved in February 2009 a compulsory requirement that research groups receiving financial aid from its R&D programmes must self-archive a copy of their published works, or the final version thereof, in the institutional repository available for this purpose at their university or public research agency, and/or in the Community of Madrid’s own independent repository, within six months of publication for the technology and bioscience areas and twelve months for the social sciences and humanities.
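The Madrid rule just described is, at bottom, a simple date calculation. A minimal sketch of it, assuming hypothetical area labels and a hand-rolled month-addition helper (neither of which comes from the regulation itself), might look like this:

```python
from datetime import date

# Embargo windows from the 2009 Community of Madrid rule described above:
# six months for technology and bioscience, twelve for social sciences
# and humanities. The area labels themselves are invented for this sketch.
EMBARGO_MONTHS = {"technology": 6, "bioscience": 6,
                  "social_sciences": 12, "humanities": 12}

def add_months(d: date, months: int) -> date:
    """Return the same day-of-month `months` later, clamped to month end."""
    y, m = divmod(d.month - 1 + months, 12)
    year, month = d.year + y, m + 1
    leap = year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)
    last_day = [31, 29 if leap else 28, 31, 30, 31,
                30, 31, 31, 30, 31, 30, 31][month - 1]
    return date(year, month, min(d.day, last_day))

def deposit_deadline(published: date, area: str) -> date:
    """Latest date by which the work must be self-archived."""
    return add_months(published, EMBARGO_MONTHS[area])
```

For example, a bioscience article published on 15 February 2009 would have to be deposited by 15 August 2009, while a humanities article published the same day would have until 15 February 2010.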
Authors’ reactions to their institution’s mandate vary according to the source consulted. In a study by Swan and Brown (2005) of researchers’ views on open access, 95 per cent of the academics asked answered that they would have no objection to self-archiving their works in their university’s repository if the institution required it. However, an earlier study by Rowlands et al. (2004) shows that 38 per cent of the authors interviewed would resist such an order and would not accept compulsory self-archiving. Although these results may seem somewhat contradictory, they point to a positive change in authors’ attitudes towards the open access initiative over a short period of time. All the surveys coincide in the fact that scientists hold two viewpoints, that of the author and that of the reader, and their attitude varies according to which role they are fulfilling: as authors, researchers resist self-archiving their works because of the many barriers they encounter; as readers, they favour self-archiving because they want the scientific communication process to be fluid.
As we have already discussed, university repositories obtain institutional support at different levels. Up to now we have talked about mandates and about how many universities adopt this rather strict type of open access policy. But it is not always like this: other, less extreme, intermediate measures exist that foster and promote the existence of repositories and support their evolution. We group them below:
Signing texts and declarations in favour of open access, such as the Berlin Declaration on Open Access to Knowledge in the Sciences and Humanities.24 This document, signed in October 2003, has the mission to ‘support new possibilities of knowledge dissemination, not only through the classical manner, but also using the paradigm of open access by means of the Internet’. The signatories define open access as ‘a wide source of human knowledge and cultural heritage approved by the scholarly community. In order to achieve the view of a global and accessible knowledge representation, the web in the future has to be sustainable, interactive and transparent. The content and the software tools must be freely accessible and compatible’. In addition, they advise researchers ‘to deposit a copy of all their published articles in an open access repository and encourage them to publish these articles in open access journals’. So far25 266 institutions have signed the Declaration and therefore recommend open access as a scientific communication route.
Another very recent example is the Compact for Open-Access Publishing Equity (COPE)26 from September 2009. The first signatories of this document were five North American universities: Cornell University, Dartmouth College, Harvard University, MIT and the University of California at Berkeley, which recognise the ‘crucial value of the services provided by scholarly publishers, the desirability of open access to the scholarly literature, and the need for a stable source of funding for publishers who choose to provide open access to their journals’ contents’ and commit to the ‘timely establishment of durable mechanisms for underwriting reasonable publication charges for articles written by its faculty and published in fee-based open-access journals and for which other institutions would not be expected to provide funds’.
Definition of repositories’ internal policies, covering four aspects: the use of data and metadata; content and its characteristics, such as the document typologies to be included and their definitions; self-archiving procedures, authorised access, embargoes, etc.; and, finally, preservation and the processes for the long-term conservation of materials. Few of the repositories registered in ROAR and OpenDOAR27 have established this type of policy but, to cite an example of good practice, the University of Nottingham28 has defined all these aspects of the internal operation of its service very thoroughly.
Economic funding by universities to open up the results of scientific research. Thus in January 2008 the University of California at Berkeley started a project called the ‘Berkeley Research Impact Initiative’,29 which seeks to cover its authors’ open access publishing costs. Teaching staff, post-doctoral researchers and graduate students can request financial aid of up to $3,000 to cover the cost of publishing an article in open access, as well as up to $1,500 of the costs of ‘hybrid publications’, in which the information is accessible free of charge but the journals limit redistribution rights. The pilot programme was conceived to last 18 months or until the allocated funds of $125,000 ran out. For their part, two other North American universities, the University of North Carolina at Chapel Hill and the University of Wisconsin-Madison, followed California’s lead and launched two analogous projects with similar objectives.30
The creation of specialised services offering the teaching and research staff of each university assistance, guidance and support, above all on aspects related to open access: self-archiving, copyright, commercial publishers’ policies, etc. In June 2009 the SHERPA31 team at the University of Nottingham inaugurated its Centre for Research Communication (CRC), which houses the open access services, initiatives and projects currently being implemented at that university, among which we can include the SHERPA Association, the RoMEO, Juliet and OpenDOAR open access services, the Repositories Support Project and the university’s contribution to the DRIVER, DART-Europe and NECOBELAC projects. Bill Hubbard, who is responsible for the project, defines its objectives: ‘we aim to develop innovative research and development activities across the whole field of research communication. This is an exciting time for authors and researchers. We are beginning to leave behind straightforward electronic analogues of our centuries-old print world and realise the possibilities of new and far richer forms of scholarly communication’.
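As a purely hypothetical sketch, the four internal policy aspects named above (metadata, content, deposit procedures and preservation) could be captured in a single structured record that a repository publishes for its users; every concrete value below is invented for illustration, not drawn from any real repository’s policy:

```python
from dataclasses import dataclass

# One record per repository, with a field for each of the four policy
# aspects discussed in the text. All values are illustrative placeholders.
@dataclass
class RepositoryPolicy:
    metadata: dict       # use of data and metadata
    content: dict        # accepted document typologies and definitions
    deposit: dict        # self-archiving procedure, access, embargoes
    preservation: dict   # long-term conservation commitments

policy = RepositoryPolicy(
    metadata={"reuse": "free for non-commercial purposes with attribution"},
    content={"types": ["article", "thesis", "working paper"],
             "peer_reviewed_flagged": True},
    deposit={"who": "registered members of the institution",
             "max_embargo_months": 12},
    preservation={"formats_migrated": True, "backup": "off-site, nightly"},
)
```

Publishing the policy in a machine-readable form like this, rather than as scattered prose, makes it easier for directories such as OpenDOAR to record and compare repositories’ practices.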
Therefore, even though universities recommend self-archiving through all these measures, authors still face many barriers in the process of depositing electronic copies of their works in their institution’s repository. Most resist self-archiving simply out of ignorance, unaware of the advantages it may bring to their careers. Others are frightened by the lack of control that may follow once their articles are deposited (plagiarism, conflicts of interest, etc.). To these reasons we can add lack of time, lack of familiarity with information technologies, resistance to change, lack of motivation, objection to sharing results, and so on.
Jones (2007) identifies the different stakeholders in university repositories, each with different responsibilities and tasks within them. He groups them into four categories, which we list below:
1. End users. In the first instance, these are the members of the university community who use the repository to meet their information needs. Teachers, researchers, students and other staff working at the university form this first group of stakeholders, although we can also speak of end users from other institutions, as the essence of any repository is that it is openly, and therefore freely, accessible to everyone through the Internet.
2. Information providers. Within this group, Jones brings together authors, peer reviewers, publishers and library and information services. Authors are indispensable to the creation of repository contents, as they produce and disseminate the results of their projects and research. Within a university they are the members of its community, that is to say, teachers, researchers, students and administrative staff, who are also the main recipients of this information; as we have already mentioned, depending on the role adopted, they act either as information providers or as end users.
At the heart of scientific publishing, peer review is the critical assessment that experts who are not part of a journal’s editorial team make of the manuscripts submitted to it in order, among other things, to prevent works of poor quality from being published, to ensure that research results are correctly interpreted and to select texts according to readers’ interests (Hames, 2007). In general, university repositories do not carry out this type of review, but most of them distinguish works that have been through a peer review process from those that have not. The fact that the contents deposited in university repositories are not controlled by a committee of experts, as in a scientific journal, is another of the objections many researchers raise against self-archiving their research in them. Looking to the future, it would be appropriate to propose some verification system within university repositories, because this type of filter is now necessary for research assessment and for the consequent recognition of the scientific author.
Publishers are considered another important stakeholder in the sphere of university repositories. Scovill (1995) summarised the functions they fulfil in the traditional scholarly communication system in three sections: publishing and production, which covers quality control of publications (not only content, but also language and style), their format and design, and the documentary treatment of articles (indexing and abstracting); legal and financial aspects, which cover all the procedures related to contracts with authors, management of the copyright of articles already published, etc.; and marketing, through which all the products the publisher sells are released. In the context of digital publishing, however, the publishers’ role differs and, according to Jones (2007), consists of establishing their position on the type of content they would like to see deposited in open access digital archives. In addition, according to the same author, all these changes in the process of scientific communication are having a serious effect on the business of commercial publishers, even though the publishers themselves have been responsible for much of this transformation.
Libraries and information centres are the last of the stakeholders acting as information providers. Several authors agree that university libraries contribute actively to the evolution of scientific communication and play an essential role in the establishment of institutional repositories. The development of repositories has allowed librarians to put into practice technological capacities and skills previously unseen by their users, giving their professional field greater relevance in the academic world. Librarians acquire and create electronic resources and make their content more easily accessible, and they are experts in the areas of communication, preservation, metadata management, promotion and dissemination (Crow, 2002; Horwood et al., 2004; Read, 2008). Moreover, it is very common for the library to be the unit in charge of directing and coordinating a university's institutional repository project, so that librarians are the ones who take on the role of repository managers. Not all authors see it in the same way, however. According to Brown and Swan (2007), repository management is one of the seven most important roles played by an information professional in an academic library, and 61 per cent of researchers think it will be an essential activity for librarians over the next five years. We could also cite the opinion of Ottaviani, who considers that leading repository projects in universities is a responsibility of librarians, as they display capacities and skills that, while not unique, are very well suited to such a task (Ottaviani and Hank, 2009). There are opposing views, however, such as that of Hank, who maintains that leadership must not be exclusive to librarians but shared, because collaborative work with the other stakeholders is necessary in order to build a good repository (Ottaviani and Hank, 2009).
Hernández Pérez et al. (2007) are categorical in stating the benefits academic libraries perceive when they adopt a leadership role in institutional repositories: greater recognition from the research community, because they can offer new and better services, such as reports on citations and document downloads within the institution or the supply of data needed for evaluating the research community.
3. Information mediators. This third group is made up of other services which are also information suppliers, but in an indirect way, such as aggregators and academic search engines. The function of these information intermediaries is to gather data from the repositories through a single interface, regardless of the software and technical characteristics each of them may have.32
Aggregators33 and repositories are interoperable thanks to the Open Archives Initiative (OAI),34 which should not be confused with the acronym for open access (OA). According to the definition by Suber (2007), the OAI, launched in 1999, defines a protocol, the OAI-PMH (Open Archives Initiative-Protocol for Metadata Harvesting), for collecting metadata held in separate archives. When the protocol is used by data services such as search engines, they can process the data from separate archives as if they were held in a single one. In technical terms, the protocol for metadata harvesting is what supports interoperability. In view of the good results obtained from the creation of institutional repositories and the exchange of information among scientists, the OAI has undertaken a new initiative called OAI-ORE (Open Archives Initiative-Object Reuse and Exchange)35 (v. 1.0, 17 October 2008), whose main objective is to develop specifications that allow repositories to exchange aggregations of web resources that make up logical units of information (complex digital objects). Following the description that Rumsey and O’Steen (2008) give of OAI-ORE, it is a standard completely independent of OAI-PMH, neither broadening nor replacing it: it focuses mainly on content, unlike OAI-PMH, whose object is the metadata. Along these lines, it seems appropriate to highlight the project undertaken by the Texas Digital Library (TDL)36 and its federated collection of electronic theses from the different public and private institutions that form it (see Figure 5.4). The experience consisted of adding OAI-ORE support to DSpace, the platform under which the collections of electronic theses are created. The results demonstrated the flexibility of the software’s architecture and confirmed the benefits of adding extra harvesting functionality aimed at simplifying the maintenance of federated collections.
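As a minimal illustration of how OAI-PMH harvesting works, the following Python sketch parses a fabricated ListRecords response of the kind a repository returns; the identifier, title and creator are invented for the example, and a real harvester would fetch the XML over HTTP from the repository's OAI endpoint and handle resumption tokens.

```python
# Minimal sketch of parsing an OAI-PMH ListRecords response.
# The sample XML below is fabricated; a real harvester would retrieve it
# from a repository endpoint (hypothetical URL):
#   http://repository.example.org/oai?verb=ListRecords&metadataPrefix=oai_dc
import xml.etree.ElementTree as ET

OAI = "{http://www.openarchives.org/OAI/2.0/}"
DC = "{http://purl.org/dc/elements/1.1/}"

SAMPLE_RESPONSE = """<?xml version="1.0"?>
<OAI-PMH xmlns="http://www.openarchives.org/OAI/2.0/">
  <ListRecords>
    <record>
      <header><identifier>oai:example.org:123</identifier></header>
      <metadata>
        <oai_dc:dc xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/"
                   xmlns:dc="http://purl.org/dc/elements/1.1/">
          <dc:title>Sample thesis</dc:title>
          <dc:creator>Doe, Jane</dc:creator>
        </oai_dc:dc>
      </metadata>
    </record>
  </ListRecords>
</OAI-PMH>"""

def harvest(xml_text):
    """Return (identifier, title, creator) tuples from a ListRecords response."""
    root = ET.fromstring(xml_text)
    records = []
    for record in root.iter(OAI + "record"):
        identifier = record.find(OAI + "header/" + OAI + "identifier").text
        title = record.find(".//" + DC + "title").text
        creator = record.find(".//" + DC + "creator").text
        records.append((identifier, title, creator))
    return records

print(harvest(SAMPLE_RESPONSE))
# → [('oai:example.org:123', 'Sample thesis', 'Doe, Jane')]
```

Because every repository exposes the same verbs and the same minimal Dublin Core format, a service such as an aggregator can run this kind of parsing over many archives and treat the result as a single collection.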
Jones also considers academic search engines37 to be information mediators: as such, apart from retrieving web pages, patents or monographs, they are able to gather every type of material deposited in institutional repositories.
4. Meta-information users. Within this last type, the author groups together the agencies that finance research, academic institutions and national bodies whose policies support the open access initiative and the existence of repositories. To cite but one example, the UK’s Joint Information Systems Committee (JISC)38 and SHERPA,39 both British organisations, have from the beginning been deeply involved in the field of digital repositories through their projects, programmes40 and events.41 In the Netherlands, the SURF Foundation42 stands out as an organisation that, in a way similar to JISC and SHERPA, works very actively on this subject.43
As we have mentioned before, libraries and university information services are essential stakeholders in institutional repositories. In most cases, the initiative to implement the service comes from the library, and librarians are the ones who coordinate and lead the project and who ultimately bear a great part of the responsibility, although it is very important to count on the help of other units and services of the institution. Analysing the SHERPA document, most recently revised in March 2009, which describes the abilities and skills of institutional repository managers, we notice that most of the qualities it lists are typical of librarians:
These aptitudes and others justify the importance of the librarian’s work within a university’s institutional repository project, and his/her experience and capacity make his/her presence indispensable in this type of initiative from two points of view: on the one hand, the internal tasks related to the creation and implementation of the repository and, on the other, the external work done with the aim of offering services that facilitate the use and management of this tool. With regard to the internal activities of the library and its librarians in setting up an institutional repository, we can point out the following:
1. Creation of the list of document types and formats which can be included in the repository, as librarians are experts in the management, description and analysis of documents and know the characteristics of each one perfectly. In the studies on the state of institutional repositories that have been prepared, generally by country or geographical area,44 the majority of the contents are doctoral theses and journal articles, but a university repository can include many other types, such as working papers, conference papers, scientific reports, teaching materials, multimedia objects, datasets, etc.
2. Description and analysis of the typologies included in the repository, in accordance with suitable metadata standards. One of the tasks that librarians have always carried out is the cataloguing of every type of bibliographical material. The internal and external description of documents, in accordance with established rules, has been necessary so that end users have at their disposal a catalogue which lets them identify and locate the documents in any library or information service. The same happens with the contents included in an institutional repository. Metadata, generically defined as ‘data about data’, are used to identify, describe and locate the digital objects included in it. The librarian will be the person in charge of choosing the appropriate standard and, in accordance with it, will analyse all the contents in the repository. At present the most widely used metadata model is Dublin Core,45 although most of the software platforms used for creating repositories support other schemes, including PRISM,46 MODS47 and METS.48
3. Creation of a list of subjects or vocabularies for the thematic categorisation of the contents of the repository. The software platforms generally used tend to include a list of generic subjects which does not correspond to the disciplines existing in each university. For this reason, the librarian must be responsible for adapting the subjects to the contents and must customise the initial list, creating an individualised vocabulary for each institution. It is advisable to begin with a base glossary and to modify it progressively as new contents emerge, with the assistance of teachers, researchers and other authors.
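The metadata work described in point 2 can be made concrete with a small sketch. The Python example below validates a hypothetical thesis record against the fifteen elements of simple Dublin Core; the element names are those of the standard, while the field values and the handle URL are invented for illustration.

```python
# The 15 elements of simple Dublin Core, the most widely used metadata
# model in repository platforms.
DC_ELEMENTS = {
    "title", "creator", "subject", "description", "publisher", "contributor",
    "date", "type", "format", "identifier", "source", "language",
    "relation", "coverage", "rights",
}

def validate_dc(record):
    """Reject any field that is not one of the 15 simple Dublin Core elements."""
    unknown = set(record) - DC_ELEMENTS
    if unknown:
        raise ValueError(f"not Dublin Core elements: {sorted(unknown)}")
    return record

# A hypothetical doctoral-thesis record; all values are illustrative.
thesis = validate_dc({
    "title": "A Study of Institutional Repositories",
    "creator": "Doe, Jane",
    "date": "2009",
    "type": "doctoralThesis",
    "identifier": "http://hdl.example.org/10000/123",  # hypothetical handle
    "rights": "open access",
})
print(sorted(thesis))
```

In practice the librarian's choice of standard determines both the element set checked here and the depth of description; richer schemes such as MODS or METS would replace the flat element list with a structured one.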
1. Suggest marketing strategies, essentially in two directions: on the one hand, to extend the service to all the members of the institution and beyond it; on the other, to promote self-archiving by authors. For the first objective, the most effective method is to organise workshops where the service is presented and its benefits are discussed. It is advisable to devise a schedule and to structure the sessions by interest group: the university governing team, deans and directors of schools, teachers, scholarship holders, students, administration and service staff, etc. To give incentives for self-archiving, there are different strategies, such as general promotion, the development of services, the ‘harvesting’ of contents from other systems,49 the review of researchers’ bibliographies, usage data and institutional policies (Mark and Shearer, 2006), or marketing done by authors who have already experienced the benefits of depositing documents in a repository, such as the increase in citations (Zuccala et al., 2008).
2. Create tools for training the whole university community in the use of the repository. The librarians will be responsible for drawing up the guides the service needs: on self-archiving, on copyright, on specific vocabulary, etc. In most cases, the library is the intermediary between the researchers and the service. Any doubts that arise about it will be channelled to the library staff, so they must be prepared for every type of question, comment or suggestion.
Although all these tasks are in the hands of the library and its librarians, we cannot forget that every institutional repository project needs the assistance of other university services, such as the computing department, publications, legal consultancy, etc., so institutional synergies and cross-departmental collaboration are aspects to take into account. If relations among the different departments are good, success is guaranteed and the repository will become a new vehicle for communicating research, one which will never replace the traditional methods but will complement and improve them.
The earliest initiatives to digitalise50 libraries and information services arose primarily to replace microforms,51 which became widespread in American universities around 1950. Within the library setting, this kind of material was used to store, preserve and conserve valuable documents which were difficult for patrons to consult. Nowadays, libraries all over the world have vast digital collections, and transforming printed matter into electronic content has become a necessary process in any information unit. As a result, these services are receiving financial support from governments and companies in order to conduct massive document digitalisation projects which make it easier for patrons to access the collections via the Internet.
The benefits of digital documents are beyond question. Abby Smith (1999) suggests a few:
Their flexibility makes it easy to edit, manage, format and print an indefinite number of identical copies without the need for a paper copy. However, this quality also has its negative side, as the very plasticity of digital documents makes it easy for anyone to alter them if prior controls are not put into place.
Within the field of research, they provide access to certain teaching materials and special collections, such as rare books, manuscripts, images, etc., whose particular features would otherwise make them inaccessible.
Hughes (2004), in turn, largely agrees with Smith and adds that the main advantage of digitalisation in a university library is that electronic documents foster access to all kinds of collections, to materials in all formats, and thus reach a wider audience. Furthermore, they help to preserve and protect the original documents, especially those with special physical qualities, and they contribute to collection development by merging or complementing the contents of other libraries. Hughes concludes by stating that digitalisation projects enhance an institution’s strategic benefits by conferring more prestige on it, and that they also contribute to education and research within the scientific community.
University libraries have created special units made up of teams of expert professionals primarily occupied with planning and launching digital projects. Today, these departments have become the core of the information services, as they are in charge of implementing all library innovation programmes. As seen in Figure 5.5, the libraries at Stanford University have very clearly organised their digital services and collections. What stands out first is the prominence they are given in the teaching, learning, research and lifelong education of their patrons, and secondly the easy fit between the digital and traditional collections.
Each of these digital collections is targeted at different patron groups and meets different information needs. For example, peer-reviewed publications and proceedings from conferences and colloquia are fundamental documents for the research community, so in the Stanford model they are directly related to the Scholarly Communication Service, in addition to serving as essential components of the institutional repository. On the other hand, the users of teaching and classroom materials are mainly students in the many academic degree programmes.
The appearance and widespread use of digital services has at times enhanced the practice of traditional tasks at university libraries and at others relegated them to secondary status, although the two coexist without conflict. Despite recent studies showing a drop in interlibrary loans as a result of the surge in electronic information (Goodier and Dean, 2004; Echeverria and Barredo, 2005; Egan, 2005), Willett (2009) argues that the launch of massive digitalisation projects with limited availability on the web, which we shall discuss in the paragraphs below, will reinforce this traditional service, as patrons will discover a larger number of sources that interest them yet are only partially available online. Likewise, Joint (2008) shows through usage statistics at British and American libraries that the circulation or loan of printed collections is rising every year and that many high-quality resources are only available on paper; he therefore concludes that this format should not be underestimated. However, it is a fact that in times of recession, institutions’ limited monetary resources are channelled towards purchasing digital contents and refitting open spaces with technological facilities (Joint, 2009).
i2010: Digital Libraries: This initiative dates from 2005 and addresses the need for digitalisation, online accessibility and digital preservation of both the cultural heritage and scientific information. It seeks to boost European cooperation in this area, avoiding duplicated effort, fostering the adoption of good practice and paying special attention to the efforts of national and deposit libraries. All of this is expressed in specific programmes like PRESTOSPACE52 (2004–2007), which with nine million euros in co-financing produced a set of tools for digitalising audiovisual materials, and eContentplus53 (2005–2008), which administered 60 million euros for projects aimed at improving the accessibility and use of European cultural and scientific contents, with the goal of achieving interoperability among national digital collections and services.
In 2006, the Council of the European Union encouraged the member states to address problems related to digitalisation and online access to cultural material and digital conservation, following a calendar that spanned from 2007 until 2009 and was based on the following actions: reinforcing national strategies and objectives on digitalisation and digital conservation, strengthening coordination among the member states, contributing to creating a European digital library, working together to provide a global vision of progress Europe-wide, and improving the framework conditions for digitalisation and the online accessibility of cultural material and digital conservation.
Under the umbrella of the European Union’s 7th Framework Programme,54 other projects are under way related to the digitalisation of cultural heritage and the coordination of libraries’ efforts in this area: DL.org – Coordination Action on Digital Library Interoperability, Best Practices and Modelling Foundations,55 whose main objective is to ‘create a framework where the leading representatives of digital library projects can work together, discuss and exchange experiences, work on the interoperability of their solutions, promote shared norms and provide the community with new directions’; and PrestoPRIME,56 whose mission is to develop strategies for the long-term preservation of digital objects and to improve access to audiovisual collections. It also aims to develop a metadata conversion tool and a system for rights management and digital fingerprint records.
The United States has its own national digitalisation policies that help libraries take decisions on this issue. Worth mentioning is the 2007 ‘Draft Principles for Digitized Content’57 from the Digitization Policy Task Force of the ALA’s Office for Information Technology Policy, which includes nine principles, each revolving around one aspect of digitalisation: digital libraries, digital materials, collaboration, sustainability, communication, international, education, preservation and standardisation. These country-based guidelines are joined by handbooks of good practice for digitalisation projects which serve as institutions’ baseline and guideposts when launching programmes of this kind.58
It is important for each library to draw up its own action handbook adapted to its own characteristics and needs, since, generally speaking, prior planning of a digitalisation project is necessary in any organisation to ensure a satisfactory outcome. This general digitalisation plan will then lay the groundwork for the development of digital collections at a university library; it will be a highly useful tool for all the staff involved and will include all the factors to be borne in mind in a digitalisation project. Although we are not going to analyse the entire format transformation process in depth, we do wish to stress four considerations regarding the pre-digitalisation phase.
The tasks include setting realistic, achievable objectives while drafting a monetary report that includes all the expenditures the project will generate, as well as choosing the documents or collections to be digitalised. If the material resources are sufficient, the entire process is usually conducted inside the library itself. However, if the library does not have the appropriate technology, these tasks are then outsourced and a specialised company is charged with launching the project. If the library chooses the first option, it is important to forge alliances with other departments within the institution that can lend a hand and set up an action plan for each of the units involved. However, in both scenarios, the costs of digitalisation are quite high, meaning that some sort of financing must be allocated. Hughes (2004) lists five ways of securing financial support for projects, namely:
The choice of material to be digitalised is the next step in the action plan. Figure 5.6 shows the range of collections a university library may have. In 2003 Dempsey and Childress of the OCLC Office of Research proposed this model, which shows the different types divided into four categories, each representing a different group of resources. The vertical axis represents the uniqueness of the content, while the horizontal axis shows the degree of conservation or stewardship the documents need. This grid may be highly useful when determining what kinds of documents should be digitalised.
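The logic of the Dempsey and Childress grid can be sketched as a tiny decision function. The quadrant labels below follow the usual presentation of the OCLC model but are reproduced from memory, so treat them as indicative rather than as the figure's exact wording.

```python
# Illustrative sketch of the OCLC collections grid: two axes (uniqueness
# of content, degree of stewardship needed) yield four groups of resources.
# Quadrant labels are indicative, not quoted from Figure 5.6.
def grid_quadrant(unique, high_stewardship):
    """Classify a collection by the two axes of the Dempsey/Childress grid."""
    if high_stewardship:
        return "special collections" if unique else "published content"
    return "institutional content" if unique else "open web content"

# A rare-manuscript collection: unique content needing careful stewardship,
# and therefore a strong candidate for digitalisation.
print(grid_quadrant(unique=True, high_stewardship=True))
# → special collections
```

Collections landing in the high-uniqueness quadrants are typically the strongest digitalisation candidates, since no other institution can supply a copy.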
Hazen et al. (1998) propose another strategy for choosing the most appropriate university library materials to digitalise. These authors pose a series of questions related to different aspects of the collections whose answers will guide the team in charge of the project as to whether to choose or discard the materials for digitalisation.
1. Related to the intellectual nature of the source materials, the intellectual quality of the collections must be assessed, including whether digitalisation will enhance their value, whether printed collections are sufficient for patrons and whether combining and relating them with other sources would raise their interest.
2. Related to current and potential users, it should be ascertained whether the collections to be digitalised are heavily used by patrons, whether their printed format or location hinder access to them and whether their physical conditions limit their use.
3. Related to actual and anticipated nature of use, it should be assessed whether the digitalisation of these collections will facilitate patrons’ efforts and increase the use of the collections, whether researchers agree with digitalising them and whether the librarians and information experts are willing to help out in the project.
4. Related to the format and nature of the digital product, the material characteristics and physical qualities of the collections should be assessed and their resistance to change should be determined in order to ensure that they can withstand the transformation process.
5. Related to describing, delivering and retaining the digital product, the way to make patrons aware of the existence of digital collections should be determined, along with who will authorise access to these collections and under what circumstances, and finally, how the long-term preservation of these materials will be guaranteed.
6. Related to other digital efforts, inquiries should be made as to whether there already exists a digital copy of the materials and whether other departments within the institution will help to implement the project.
7. Finally, related to costs and benefits, the benefits to be provided by digital collections should be assessed, as well as whether the value of the intellectual content of the materials is proportional to the expense of the project, whether the digital collections will entail other long-term costs and whether there will be any external financing.
Copyright is the next crucial point to bear in mind within the preparatory phase of a digitalisation project. All libraries must have a clear policy on protecting the copyright and intellectual property rights of the works in their digital collections. According to Prenafeta (2009), copyrights are the rights associated with original intellectual, artistic and scientific creations. Although it varies from country to country, we can generally distinguish between two kinds of copyrights. The first kind entails moral rights, granted exclusively to the author and not transferable; they are protectionist and remain with the author throughout his or her entire lifetime and even after death (under certain conditions). The signatories59 of the Berne Convention for the Protection of Literary and Artistic Works60 do not all regulate moral rights in the same way. Continental law, which encompasses all the European countries except the United Kingdom, as well as much of Asia, Africa and Latin America, has legal systems that recognise moral rights. This contrasts with Anglo-Saxon law, which prevails in countries such as the United States, the United Kingdom, India and Australia and which usually does not recognise moral rights. One exception is the United States: even though in theory it does not admit moral rights, it does recognise them for visual artists in accordance with the 1990 Visual Artists Rights Act, with the proviso that these rights may be relinquished. The second kind entails proprietary rights, which are the ones that generate income, are transferable and have a limited timeframe which varies according to the laws of each country.61 Proprietary rights include the exclusive rights (to reproduction, distribution and public communication), which are subject to authorisation by the holder of the rights, along with rights to remuneration and compensation.
Once these proprietary rights have expired, the work enters the public domain, meaning that it may be freely consulted without restriction or the need for any kind of permission.
As Labastida Juan and Iglesias Rebollo (2006) noted, with the arrival of the digital age, more flexible legal instruments are needed that enable works with acknowledged intellectual property to move more freely and be improved by people other than their creators. Hence the appearance of copyleft, which the same authors define, within this context, as an alternative to traditional or restrictive copyright characterised by the all-encompassing phrase ‘all rights reserved’. Copyleft uses copyright to create a less restrictive system: any use of the works is allowed as long as the author is cited; when the original work is transformed into a new one, it is also distributed with the same licence. Today, the term copyleft is used to refer to all alternative licensing systems62 as opposed to the traditional ‘all rights reserved’ model.
Bearing all this in mind, and pursuant to the Handbook on Copyright and Related Issues for Libraries put out by eIFL.net, before digitalising their collections, libraries must abide by the copyright laws in their countries, in addition to negotiating the terms and conditions of use of and access to the materials with the authors and publishers. Libraries must ensure that the contracts signed with the owners of the collections are the best for their patrons and are written specifically and clearly, and they must try to form consortia with other entities in order to get better prices.
Likewise, they should not forget the principles on copyright exceptions and limitations for libraries listed in a public document published in May 2009 by eIFL.net, IFLA and the Library Copyright Alliance, which includes the most prominent scenarios, namely:
Within this section we shall highlight two main issues: the choice of software and hardware and metadata. With regard to the former, Zhang and Gourley (2008) suggest a fairly complex process for choosing suitable software and hardware in a digitalisation process. The following points should be borne in mind: analysing original materials and the structure of their contents, the types of texts and images they include, their size and location, as well as the relations among the different digital objects; identifying the different kinds of potential users of the collections to be digitalised and their needs; quantifying a real budget that facilitates library decision-making; considering the technological compatibility with the institution’s other existing infrastructures; assigning library and computer staff to the project according to their training and technical skills; and last but not least, defining the strategic goals of digitalisation and drawing up a timeline that facilitates staff efforts.
Once these needs have been defined, the next step is to evaluate the different software and hardware options following the criteria suggested by the technical team. These criteria need not be common to all digitalisation projects, although they will usually relate to the different characteristics of the software and hardware used for digitalisation. This assessment will help the project leaders choose the most appropriate software, either open source or commercial,63 and hardware to use when digitalising the different materials.
The second factor to bear in mind within the technical requirements is the choice of the metadata that define the digital objects. The most generic definition of metadata is ‘data about data’, meaning that this concept alludes to any element that helps to identify, describe and locate electronic resources on the web. There are different kinds of metadata, but three are regarded as fundamental: descriptive metadata, which define the attributes of the digital object like its name, creator, size, etc.; administrative metadata, which include information on the object’s location, the user name, rights management, preservation, etc.; and structural metadata, which include factors like the relations among the digital objects. The process of assigning metadata to digital collections is an extremely important part of a digitalisation project in that it facilitates and maximises the accessibility of the collection. Therefore, the decisions taken on the choice of metadata standards and levels of description of these collections depend on the purposes of the organisation, the availability of human and technological resources and users’ expectations. For this reason, the metadata chosen for digital collections must meet the following criteria (NISO Framework Working Group, 2007).
They must adapt to the existing standards64 and be the most appropriate for the specific materials being digitalised, the patrons of the collection and its current and future uses. It is wise for the library to draw up an inventory of metadata standards with the descriptions and applications of each one in order to choose the most appropriate model for each digital collection.
They must be interoperable so that all the contents can be compiled and retrieved by other services. The purpose of interoperability is to help users look for and access information housed on different domains and owned by different institutions. The OAI-PMH protocol and metasearch engines, which we discussed in the chapter on repositories, enhance the interoperability of systems.
They must use authority control and content standards to describe and relate digital objects. Controlled terms are usually used to express the attributes of each object (personal names, corporate names, place names, subject headings, etc.), each expressed according to cataloguing rules, controlled vocabularies or thesauri, and classification systems.
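The authority-control idea behind this last criterion can be sketched in a few lines of Python: variant forms of a name are normalised to a single authorised heading before being recorded as metadata. The authority entries below are invented for illustration; the heading follows the usual library form for Cervantes.

```python
# Toy authority file mapping variant name forms to one authorised heading.
# In a real system this would be backed by a name authority file rather
# than a hand-written dictionary.
AUTHORITY_FILE = {
    "cervantes, miguel de": "Cervantes Saavedra, Miguel de, 1547-1616",
    "miguel de cervantes": "Cervantes Saavedra, Miguel de, 1547-1616",
    "m. de cervantes": "Cervantes Saavedra, Miguel de, 1547-1616",
}

def authorised_form(name):
    """Return the controlled heading for a name, or the name itself if unknown."""
    return AUTHORITY_FILE.get(name.strip().lower(), name)

print(authorised_form("Miguel de Cervantes"))
# → Cervantes Saavedra, Miguel de, 1547-1616
```

Normalising names this way means that a search for any variant form retrieves all the digital objects described under the authorised heading.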
As mentioned above, the digital environment has contributed to improving processes and services at libraries, but it has also turned into a breeding ground for certain problems. Even though electronic documents offer easier access to information than traditional ones, their permanence and survival over the years are not yet guaranteed.69 Today, the preservation of digital materials is an unresolved issue of great concern to academia.
Keefer and Gallart (2007) point to three preservation strategies for digital collections which we shall outline below. They stress that libraries tend to use a combination of these methods according to the formats of the resources, their planned uses and the organisation’s technical capacity:
Refreshing, or transferring data from one storage medium to another. More than a preservation strategy in its own right, this is a necessary step in the preservation process, as it lowers the risk of data loss due to the deterioration that all physical media undergo. It offers no protection against technological obsolescence, but it requires neither a major investment in equipment nor a high degree of technical knowledge on the part of the staff, so it is easy to implement in any library.
Migration is another digital preservation strategy, whose purpose is to convert a document created in a given environment and encoded in a given format into another format, so that it works on a newer, more up-to-date or standardised IT platform. This method has several advantages: as a proven operation it requires no specialised technical knowledge, part of the process can be automated, and the trend towards standardisation of software and formats makes the job easier while converting the document into a format compatible with today’s systems. Its disadvantages include the difficulty of handling complex digital objects, the risk of losing important features when the document is altered, and the need to repeat the operation periodically over the resource’s lifetime.
Environment emulation is the only strategy that ensures recovery of the original document with no alterations. Its purpose is for the computer systems of the future to be able to recover the original data as if they were running the original software. In this way, the emulator program will allow future users to see the resource just as it was when it was created. This process requires no constant tracking of the resource’s format and offers a solution for complex digital objects, as there is no need to control every format and functionality they contain. On the down side, there are still few real-world experiences, and programming emulators is complex and requires expert knowledge, meaning that the cost is high and their usefulness will depend on the predisposition of future professionals.
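As a minimal sketch of what refreshing involves, the Python fragment below copies a byte stream and uses a checksum (a fixity value) to verify that the new copy is bit-for-bit identical to the original. The file contents are invented, and the in-memory copy stands in for the write to new physical media that a real refresh would perform.

```python
import hashlib

def checksum(data: bytes) -> str:
    """Fixity value used to prove a copy is bit-for-bit identical."""
    return hashlib.sha256(data).hexdigest()

def refresh(source: bytes) -> bytes:
    """Copy a byte stream to a new support and verify nothing was lost.

    The bytes() call stands in for writing to new media; a real
    implementation would read the copy back from the new support.
    """
    before = checksum(source)
    target = bytes(source)
    if checksum(target) != before:
        raise IOError("data corrupted during refresh")
    return target

page = b"...TIFF bytes of a scanned page..."
copy = refresh(page)
print(checksum(copy) == checksum(page))   # fixity check
```

Note that the checksum only guards against loss during the copy; as the text says, refreshing does nothing about the obsolescence of the format the bytes are encoded in.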
As mentioned above, digital preservation is a priority issue that has concerned the research community for years; as a result, there has been a constant stream of reports, programmes and projects on the subject. The European initiatives include the DELOS (Digital Preservation Cluster) programme, which began in 2004; the creation of the DCC (Digital Curation Centre) in 2004, the leading organisation on conservation and preservation issues; the CASPAR (Cultural, Artistic and Scientific Knowledge for Preservation, Access and Retrieval), DPE (Digital Preservation Europe) and PLANETS (Permanent Long-term Access through Networked Services) programmes, which got underway in 2006 and are still operating today; and finally PARSE.insight (Permanent Access to the Records of Science in Europe), which was launched in 2008.
American experiences include the Library of Congress’s NDIIPP (National Digital Information Infrastructure & Preservation Program) and the different avenues of action on preservation activities within RLG/OCLC.
Within the context of university libraries, worth noting are the collection preservation and maintenance department at the Cornell University Library, along with the LOCKSS (Lots of Copies Keep Stuff Safe) programme developed by Stanford University Libraries and CLOCKSS (Controlled LOCKSS), in which libraries from all over the world participate.
Although we have reiterated throughout this chapter that one of the most important services offered by university libraries today is access to their own digital collections, it is impossible to survey all the initiatives and experiences to date. Nevertheless, because of their importance, we would like to close this section on user-centred services by discussing three recent massive digitalisation projects which provide open access to hundreds of different kinds of materials and documents:
Google Books: This is the most famous and controversial massive book digitalisation project today. Its purpose, in the words of the company itself, is to ‘facilitate the search for relevant books, especially those that cannot be found any other way, such as out-of-print books, without violating authors’ and publishers’ rights’. It has two main avenues of action. The first is the Library Project: Google has signed agreements with numerous libraries all over the world to digitalise part of their book collections. The basic information on a digitalised book, and often excerpts from it, can be viewed on Google’s search engine; if the book is out of print and exempt from copyright, it can be seen in its entirety. In all cases, links are provided to the online booksellers where the book is sold. Google Books also has a Partner Program, which the company extends to publishers and authors so that they can include their books in the project in order to enhance their visibility. Although at first Google Books seemed like the perfect solution for open access to information, the company has faced an onslaught of criticism and sanctions since the project was launched. To cite a recent example, in December 2009 the French courts sentenced Google to a 300,000 euro fine payable to the La Martinière publishing group for having reproduced excerpts of works without its permission.
Europeana was launched in late 2008 as a vast European digital library. It includes six million works of different kinds, including literary texts, reproductions of art works, audiovisual materials and television archives, in 21 languages. The project is financed by the European Commission and enjoys heavy backing from the member states, which have earmarked substantial budgets for it. The affiliated institutions include the libraries, museums, archives and other entities that have helped to create Europeana, along with its partners, organisations worldwide that supply the portal with content. The project won an Erasmus Award for Networking Europe in 2009 (see Figure 5.7).
The World Digital Library was created in April 2009 by a team of professionals from the United States Library of Congress with the support of UNESCO and the help and financial backing of private institutions, companies and foundations from different countries. The project includes digitalised collections of manuscripts, maps, rare books, music scores, recordings, films, engravings, photographs and blueprints. The World Digital Library interface has a unique, highly original design for accessing the information: the constituent documents are grouped by country, as well as by period, subject, kind of document and partner institution. All of this is supplemented with personalised descriptions of each document and interviews with conservators about important materials. The browsing tools and content descriptions are available in Arabic, Chinese, English, French, Portuguese, Russian and Spanish.
In 2005, Tim O’Reilly defined Web 2.0, also called the social web, as the ‘business revolution in the computer industry caused by the move to the Internet as a platform and an attempt to understand the rules for success on that new platform’. There are many other definitions of Web 2.0 that describe it as follows (O’Reilly, 2005; Anderson, 2007; Kelly et al., 2009):
As the web is regarded as a platform, all the services generated on it are externally hosted, so users can access them from any computer with an Internet connection, without needing to belong to a given organisation or entity.
The upshot of the first meetings where the Web 2.0 concept was discussed was the comparison between Web 1.0 and Web 2.0 shown in Figure 5.8. The services and products generated in the two systems are clearly distinct, and their evolution in 2.0 reveals a change in the web paradigm.
In addition to promoting the growth and progress of services, the development of the web has entailed a change in attitude and thinking not just in technology but in many sectors. Thus, we now talk about Science 2.0, Education 2.0, University 2.0, Enterprise 2.0 and Library 2.0. One of the most widely accepted definitions of the concept of Library 2.0 was contributed by Casey and Savastinuk (2006): ‘The heart of Library 2.0 is user-centered change. It is a model for library service that encourages constant and purposeful change, inviting user participation in the creation of both the physical and the virtual services they want, supported by consistently evaluating services. It also attempts to reach new users and better serve current ones through improved customer-driven offerings’. Along the same lines, Blyberg (2006) justifies the existence and importance of Library 2.0 (L2) through eleven principles:
This list reveals the deep-seated change brought about by the onset of the 2.0 revolution in library services. Patrons are no longer satisfied with the traditional services that libraries have offered until now, the methods and working tools are no longer the same, and information professionals have to experiment with the new emerging technologies. These changes are not always welcomed by professionals, so the arrival of Library 2.0 has been received unevenly across organisations. To help less experienced professionals who see technology as an obstacle to their growth and development, we would like to cite two initiatives spearheaded by librarians: A Guide to Using Web 2.0 in Libraries, published by the Scottish Library and Information Council (SLIC) in December 2009, and A Librarian’s 2.0 Manifesto, written by Laura B. Cohen and published on her blog in 2006. The guide aims to highlight the benefits of Web 2.0 for library services: reaching a wider audience, serving a larger number and wider variety of patrons, enabling value-added services to be developed, making professional development possible and enhancing the library’s promotion and marketing. The guide also includes testimonials from professionals singing the praises of Library 2.0, case studies, documents to help readers use 2.0 tools, recommendations to bear in mind when deciding to implement it and forms through which libraries can send in examples of practical uses.
Perhaps because of their degree of specialisation, university libraries were the first to begin developing 2.0 services for their patrons. Internal library tasks have also been affected by the new technologies introduced by the new web, justifying the idea that 2.0 applications know no bounds. Figure 5.9 shows several examples of the vast number of tools and resources that the social web has devised for its users: blogs, wikis, RSS feeds, social networks and many others. Below we shall discuss these tools and analyse their application in the different processes and services offered by university libraries.
Blogs are theme-based websites that are extremely easy to create. They enable users to interact with each other constantly through the comments and opinions that they post. A blog must be constantly updated, as that determines how topical and timely the tool can be. There are more than 55,000 blogs on the web on all imaginable topics, and they are one of the most widely used 2.0 resources in university libraries all over the world. The widespread use of blogs in library services is largely due to the existence of free programmes that make them easy to create, coupled with the low level of technological knowledge needed to develop one. WordPress, Blogger and LiveJournal are free, external applications that require no installation, just registration in a system through which each user gets a username and password for managing the blog from any computer with an Internet connection.
As a visual bulletin board and newsletter of the library, managed and updated by the librarians with the motto ‘at least one news item every day’. This initiative enhances visibility, promotes the service and keeps the entire university community apprised of the latest developments at the library.
Just to cite one of the many examples of blogs at university libraries today, the librarians at the North Carolina State University Library have a blog that falls within the last category listed above. The news items posted on this blog are extremely varied and range from new product displays to profiles of the staff working at the library.
‘They are Websites that allow the easy creation and editing of any number of interlinked Web pages via Web browser’.
Wikipedia is the most popular and well-known application of this tool: a web project whose goal is to develop a free encyclopaedia through user participation and contributions. Wikipedia has become an extremely important and frequently consulted source of information, and its growth is unstoppable despite the unfavourable criticism that has been levelled at it. For example, a survey of 68 high-tech professionals about how they use and share information from Wikipedia for work purposes revealed that these experts view it as a general source of reference information, but one that is not very rigorous, that is still under development and whose editorial process needs improvement. As a result, these professionals expressed little interest in contributing information to it (Chen, 2009).
Wikis are not among the 2.0 resources most frequently used in university libraries, despite the fact that, just as with blogs, free services like Wikispaces and Wikia can be used to create and host them. A study conducted in 2009 on the use of wikis in this kind of library reveals that of the 48 respondents, only 16 use this tool in their work, four have tried it, 13 are planning to use it and 15 have no intention of using it. According to this analysis (Chu, 2009), the main reasons why these libraries use wikis are the following:
In addition to these practical applications, wikis are taking shape as the ideal 2.0 tool for teamwork. They enable librarians to share their work, facilitating jobs like editing, creating and eliminating contents. If a university uses a decentralised model in which each faculty or school has its own library, wikis can be extremely helpful in technical jobs, cataloguing, subject standardisation and classification, or in drawing up thematic guides, as some of the faculties at the University of Seville, Spain, have done. There, wikis are used to keep compilations of electronic and printed information resources related to the courses taught at the university. These guides are constantly growing, being revised and updated; they support the courses, and their goal is to gather opinions and suggestions from members of the university community in order to further enrich the contents.
These are websites whose main goal is to let registered users share information. The most famous networks are Facebook and MySpace, but there are also others used in more restricted or professional spheres, like LinkedIn, Xing and Viadeo. These tools are very popular and heavily used, but they have also drawn criticism over the lack of privacy in certain cases. For this reason, all users of social networks must be aware of the dangers and risks involved and must be trained in how to use them correctly.
The main uses of social networks in university libraries are, first, as a channel for disseminating and marketing the libraries’ products and services, and second, as a means of communication among professionals from different regions and countries. There is no doubt that Facebook is the network that the majority of libraries choose to promote themselves. Most of them create a profile containing their basic information and photographs, and then use it to announce events, special hours, news, job offers, interruptions in any service, statistics, catalogue searches, article database searches, metasearches and any other new development in the library setting. The most original Facebook applications are the ones spotlighted by Boyer and Ryan at the 2009 DLF Spring Forum: a clickable map to notify friends of the physical location of Swem Library at the College of William and Mary (Williamsburg, Virginia); Eastern Illinois University’s Live Reference Chat; and an application designed by North Carolina State University Libraries to help students meet up in physical library space for planned or ad hoc activities, such as group study sessions. This last application provides an ‘at a glance’ view of all activities in the library, activities by the users’ friends and useful library information, such as hours and study room availability. Users can join activities created by others or simply broadcast their own activity to the community.
According to the definition by Rodríguez Gairín et al. (2006: 214), ‘Syndication is the process whereby a producer or distributor of contents on Internet provides these contents to a subscriber or to a network of subscribers’. It is a way of distributing information that makes it more accessible and allows for alerts and updates without the need to constantly browse the Internet. This redistribution mainly takes place in two formats: RSS (Really Simple Syndication) and Atom. For content syndication to be possible, a compatible reader and web browser are needed. There is a wide range of RSS readers, but they can all be classified into three categories:
Content syndication has been widely accepted in university libraries and serves a variety of functions, including:
Alerts on new additions to the library catalogue. For years the channel of communication that libraries used to announce their new acquisitions was e-mail; at the beginning of every week or month, the library professional had to remember to send an updated file containing the new catalogue items. With content syndication, each library user can receive this information automatically without the librarian having to do anything.
Alerts on documents published in e-journals and databases to which the library has subscribed, so that users can receive all the relevant updates without the need for them to visit the journal websites every time a new issue is published.
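A minimal sketch of how such an alert service works on the subscriber's side, assuming a plain RSS 2.0 feed of new catalogue additions (the feed contents and URLs below are invented): a reader fetches the feed and lists each item's title and link, with no librarian intervention.

```python
import xml.etree.ElementTree as ET

# A minimal RSS 2.0 feed of new catalogue additions, as a library
# might publish it (invented example data).
FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Library: new acquisitions</title>
    <item><title>Digital Preservation Handbook</title>
          <link>http://catalogue.example.edu/rec/1</link></item>
    <item><title>Metadata Fundamentals</title>
          <link>http://catalogue.example.edu/rec/2</link></item>
  </channel>
</rss>"""

def new_items(feed_xml):
    """Return (title, link) pairs for every item in an RSS 2.0 feed."""
    channel = ET.fromstring(feed_xml).find("channel")
    return [(item.findtext("title"), item.findtext("link"))
            for item in channel.findall("item")]

for title, link in new_items(FEED):
    print(title, "->", link)
```

A feed reader simply re-fetches the feed periodically and shows the user any items it has not seen before, which is why neither the librarian nor the patron has to do anything once the subscription is set up.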
Many university libraries have implemented services of this kind on their web portals or blogs. Just to cite a few examples, the University of Oklahoma Libraries offers its users the option of subscribing to library news via RSS, categorised in three ways: by collection, subject or title. Likewise, the University of Saskatchewan Library in Canada offers content syndication for the e-journals to which it subscribes.
Social bookmarking, also called tagging, is another Web 2.0 tool used to store links on the Internet with the goal of sharing them. In this way, each user can share their own resources and benefit from those compiled by others. The advantages that Grey (2005) attributes to this social tool are listed below:
Del.icio.us is perhaps the most widely used service for collectively managing social bookmarks at university libraries. It has been up and running since 2003 and has a user-friendly interface, enabling bookmarks originally stored in browsers to be added and organised through a system of tags. Gallart (2009) claims that social bookmarks are used in libraries to achieve the following goals:
This includes a group of tools aimed at hosting and sharing different kinds of digital artefacts, such as documents, photographs, videos, presentations and recordings. They are free social services that allow users to post comments, opinions and ratings. Just like the tools discussed above, there is a variety of solutions that make this shared information management possible, but the most successful ones are Google Docs for documents, which enables users to create, edit, publish and share different kinds of office documents; YouTube, a highly popular service for uploading and sharing videos; SlideShare for presentations; and Flickr for photographs.
Some libraries, especially in the United States, are beginning to use these websites to complement their face-to-face services, to offer new services or simply to seek new avenues of communication with their patrons. The majority of the experiences are virtual tours of the library, such as the videos posted on YouTube by the University Library at Wisconsin-Stout and the Tour of Library West at the University of Florida, where all the library services are shown along with an explanation of each by the person in charge. Delving a bit further, these tools are also perfect for sharing virtual exhibitions and for providing user training and library orientation every academic year.
These are computer applications which can be freely used, copied, studied, changed and redistributed. They are also classified as social, like the tools described above, because they include a variety of options for sharing and collaborating with other users. With the advent of Web 2.0, many library jobs and tools have been improved with new solutions that facilitate the work of professionals. This is the case of integrated library systems (ILS), more specifically the OPAC module, reference managers, content management systems and e-learning platforms.
Bibliographic reference managers are extremely useful tools for university libraries that enormously facilitate the work of researchers. They enable users to create their own bibliographic databases by adding items manually or by importing the references retrieved from database searches. They also allow users to organise their personal bibliography, draw up bibliographic lists that are automatically formatted correctly for inclusion in any study, insert citations into a document being written, and share bibliographic reference files with colleagues and publish them on the web. Zotero and Mendeley are open-source reference management applications that use 2.0 features and utilities, such as social networks, to share files with other researchers working on similar subjects, and they are compatible with products and services from the social web like YouTube and Flickr. Likewise, CiteULike and Connotea are social reference managers that pursue the same goal, storing millions of citations from research documents for each user that are visible to all users. CiteULike includes services like CiteGeist, which shows the most popular references in recent days, and Watchlist, which offers users tracking lists to show shared interests and to stay abreast of the new documents that each user is reading.
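The core of what these tools automate can be sketched in a few lines: store references as structured records and render them in a citation style on demand. The style rules below are deliberately simplified and are not those of any real manager; the sample record is the Catts and Lau report cited earlier in this chapter.

```python
# Toy reference store: each reference is a structured record, and a
# "style" is just a function from record to formatted string.
references = [
    {"author": "Catts, R. and Lau, J.", "year": 2008,
     "title": "Towards Information Literacy Indicators",
     "publisher": "UNESCO"},
]

def format_apa_like(ref):
    """Render one reference in a rough, simplified APA-like style."""
    return "{author} ({year}). {title}. {publisher}.".format(**ref)

# A bibliography is then just the formatted records, sorted alphabetically.
bibliography = sorted(format_apa_like(r) for r in references)
print("\n".join(bibliography))
```

Because the data is stored structurally rather than as formatted text, switching citation styles for a new journal means swapping the formatting function, not re-typing the bibliography, which is precisely what makes these managers so valuable to researchers.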
Content management systems (CMS) are applications that help users edit and manage information through a web browser. In addition to offering a clear, simple administrative system through which all the web contents are managed, they also include a variety of 2.0 utilities (blogs, surveys, news channels, ratings and comments through tags, etc.) which make them a social tool in which users shift from being mere spectators to actively providing content. Joomla and Drupal are two solutions used in university libraries to implement the institutional website or intranet. The University of León (Spain) has implemented its corporate website with Joomla, while the University of Minnesota Libraries has designed the website of its Biomedical Library using Drupal.
E-learning platforms are used at university libraries for virtual user training to complement face-to-face training. They are dynamic learning environments aimed at facilitating the teaching/learning process. Prominent examples include Moodle (Modular Object-Oriented Dynamic Learning Environment), which has been implemented at many universities all over the world, and Sakai, which began to be used in the United States but has quickly spread to other parts of the world. Both solutions allow 2.0 tools such as blogs, wikis, discussion forums and RSS channels to be included, making them more open and interactive. To cite two examples of university libraries that use Moodle, the Carleton College Library offers library resources that can be included via Moodle in the courses in which students are enrolled, and at The London School of Economics and Political Science of the University of London, the Library User Education Team’s portal complements face-to-face training.
Technological advances pave the way for new concepts that extend the web. In this chapter we have discussed Web 2.0, which entails user involvement in content creation as opposed to mere content consumption. Yet new developments are also in the works, including Web 3.0, in which information is semantic in nature and can be interpreted by both humans and machines, and Web 4.0, in which intelligent personal agents and distributed search will come to the forefront.
ACRL/ALA, Information Literacy Competency Standards for Higher Education, 2000. Available from:. http://www.ala.org/ala/mgrps/divs/acrl/standards/informationliteracycompetency.cfm
ACRL. Characteristics of Programs of Information Literacy that Illustrate Best Practices: A Guideline. Available from: http://www.ala.org/ala/mgrps/divs/acrl/standards/characteristics.cfm, 2003.
ALA’s Office for Information Technology Policy, Draft Principles for Digitized Content, 2007. Available from:. http://dltj.org/article/ala-oitp-digital-proposal
Anderson, P., What is Web 2.0? Ideas, technologies and implications for education. JISC Technology & Standards Watch. 2007 Available from:. http://www.jisc.ac.uk/media/documents/techwatch/tsw0701b.pdf
Bainton, T., Information literacy and academic libraries: The SCONUL approach (UK/Ireland). Proceedings of the 67th IFLA Council and General Conference, 2001 Available from:. http://archive.ifla.org/IV/ifla67/papers/016-126e.pdf
Blyberg, J., 11 reason why Library 2.0 exists and matters, 2006. Available from:. http://www.blyberg.net/2006/01/09/11-reasons-why-library-20-exists-and-matters
Boyer, J., Ryan, J., Considering Facebook in the library. Proceedings of the Digital Library Federation, Spring Forum. 2009 Available from:. http://www.diglib.org/forums/spring2009/presentations/Boyer.pdf
Brown, S., Swan, A. Researchers’ use of academic libraries and their services, a report by the Research Information Network and the Consortium of Research Libraries. Available from: http://eprints.ecs.soton.ac.uk/13868/1/libraries-report-2007.pdf, 2007.
Casey, M., Savastinuk, L.C., Library 2.0: Service for the next-generation library. Library Journal. 2006 Available from:. http://www.libraryjournal.com/article/CA6365200.html
Catts, R., Lau, J., Towards Information Literacy Indicators. UNESCO, Paris, 2008. Available from:. http://www.uis.unesco.org/template/pdf/cscl/InfoLit.pdf
Cohen, L., A Librarian’s 2.0 Manifesto, 2006. Available from:. http://liblogs.albany.edu/library20/2006/11/a_librarians_20_manifesto.html
Crow, R., The Case for Institutional Repositories: A SPARC Position Paper, 2002. Available from:. http://www.arl.org/sparc/bm~doc/ir_final_release_102.pdf
Egan, N. The impact of electronic full-text resources on inter-library loan: A ten year study at John Jay College of Criminal Justice. Journal of Interlibrary Loan, Document Delivery and Electronic Reserve. 2005; 15(3):23–41.
eIFL.net, Handbook on Copyright and Related Issues for Libraries, 2009. Available from:. http://www.eifl.net/cps/sections/services/eifl-ip/issues/handbook/handbook-complete-text/downloadFile/file/handbook2009_en.pdf?nocache=1256507140.35
Gallart, N., Delicious en la Biblioteca Universitaria de Sabadell UAB, 2009. Available from:. http://comunidad20.sedic.es/?p=278
Grey, D. Social Bookmarking – More Than Meets the Eye. Available from: http://denham.typepad.com/km/2005/01/social_bookmark.html, 2005.
Harnad, S., McGovern, N., Institutional repository success is dependent upon mandates. Bulletin of the American Society for Information Science and Technology. 2009;35(4) Available from:. http://www.asis.org/Bulletin/Apr-09/AprMay09_Harnad-McGovern.pdf
Hazen, D., Horrell, J., Merrill-Oldham, J. Selecting Research Collections for Digitization. Available from: http://www.clir.org/pubs/reports/hazen/pub74.html, 1998.
Hedlund, T., Rabow, I. Open Access in the Nordic Countries: A State of the Art Report. Available from: http://www.nordforsk.org/_img/oa_report_020707.pdf, 2007.
Hernández Pérez, T., Rodríguez Mateos, D., Bueno de la Fuente, G., Open Access: El papel de las bibliotecas en los repositorios institucionales de acceso abierto. Anales de Documentación, 10. 2007 Available from:. http://revistas.um.es/analesdoc/article/viewFile/1141/1191
Kennan, M.A., Kingsley, D., The state of the nation: A snapshot of Australian institutional repositories. First Monday. 2009;14(2) Available from:. http://firstmonday.org/htbin/cgiwrap/bin/ojs/index.php/fm/article/viewArticle/2282/2092
Labastida Juan, I., Iglesias Rebollo, C., Guía sobre gestión de derechos de autor y acceso abierto en bibliotecas, servicios de documentación y archivos. SEDIC, Madrid, 2006. Available from:. http://www.sedic.es/dchos_autor_normaweb.01.07.pdf
Lau, J., Guidelines on Information Literacy for Lifelong Learning. IFLA, Veracruz, Mexico, 2006. Available from: http://archive.ifla.org/VII/s42/pub/ILGuidelines2006.pdf
Lynch, C.A., Institutional repositories: Essential infrastructure for scholarship in the Digital Age. ARL: Bimonthly Report, No. 226, 2003. Available from: http://www.arl.org/resources/pubs/br/br226/br226ir.shtml
Mark, T., Shearer, K., ‘Institutional repositories: A review of content recruitment strategies’, World Library and Information Congress: 72nd IFLA General Conference and Council, 2006. Available from: http://archive.ifla.org/IV/ifla72/papers/155-Mark_Shearer-en.pdf
Markey, K., et al., Census of Institutional Repositories in the United States: MIRACLE Project Research Findings, 2007. Available from: http://www.clir.org/pubs/reports/pub140/pub140.pdf
Marti, M.C., et al., Alfabetización digital: Un peldaño hacia la sociedad de la información. Medicina y Seguridad del Trabajo. 2008;54(210). Available from: http://scielo.isciii.es/pdf/mesetra/v54n210/especial2.pdf
Melero, R., et al., Situación de los repositorios institucionales en España: Informe 2009, 2009. Available from: http://digital.csic.es/bitstream/10261/11354/1fInforme2009-Repositorios_0.pdf
NISO Framework Working Group, A Framework of Guidance for Building Good Digital Collections. National Information Standards Organization, Baltimore, 2007. Available from: http://www.niso.org/publications/rp/framework3.pdf
O’Reilly, T., What is Web 2.0: Design Patterns and Business Models for the Next Generation of Software, 2005. Available from: http://oreilly.com/web2/archive/what-is-web-20.html
Ottaviani, J., Hank, C., Libraries should lead the institutional repository initiative and development at their institutions. Bulletin of the American Society for Information Science and Technology. 2009;35(4). Available from: http://www.asis.org/Bulletin/Apr-09/AprMay09_0ttaviani-Hank.pdf
Prenafeta, J., Contenido de los derechos de autor: Derechos morales y derechos patrimoniales, 2009. Available from: http://es.safecreative.net/2009/01/15/contenido-de-los-derechos-de-autor-derechos-morales-y-derechos-patrimoniales
Rumsey, S., O’Steen, B., OAI-ORE, PRESERV2 and digital preservation. Ariadne. 2008;(57). Available from: http://www.ariadne.ac.uk/issue57/rumsey-osteen/
Scottish Library and Information Council, Guide to Using Web 2.0 in Libraries, 2009. Available from: http://www.slainte.org.uk/files/pdf/web2/Web2GuidelinesFinal.pdf
Smith, A., Why Digitize, 1999. Available from: http://www.clir.org/pubs/reports/pub80-smith/pub80.html
Suber, P., Budapest Open Access Initiative: Frequently Asked Questions, 2007. Available from: http://www.earlham.edu/~peters/fos/boaifaq.htm
Swan, A., Brown, S., Open Access Self-archiving: An Author Study. Key Perspectives, 2005. Available from: http://www.jisc.ac.uk/uploaded_documents/Open%20Access%20Self%20Archiving-an%20author%20study.pdf
van Eijndhoven, K., van der Graaf, M., Inventory Study into the Present Type and Level of OAI Compliant Digital Repository Activities in the EU, 2007. Available from: http://www.driver-support.eu/documents/DRlVER%20lnventory%20study%202007.pdf
Zuccala, A., Oppenheim, C., Dhiensa, R., Managing and evaluating digital repositories. Information Research. 2008;13(1). Available from: http://informationr.net/ir/13-1/paper333.html
9. At many universities, students attending these specialised courses can earn academic credit for them, which has increased their popularity within the student community.
10. A service offered by the Malmö University Library (Sweden): users can book a librarian for a free individual 60-minute session of guided information searching. http://www.mah.se/english/Library/Services/Book-a-Librarian
12. E-prints are preliminary digital versions of a work that the author intends to submit to a formal journal but first distributes among a group of departments or colleagues working in the same area of interest, firstly to receive reviews, comments and suggestions, and secondly to announce the direction and results of the research he/she is conducting.
17. These are Harnad’s and Suber’s answers to a question posed by Remedios Melero, which she disseminated at the 1st Conference on OS-Repositories (Zaragoza, Spain, 2006).
32. This is what the authors call ‘interoperability’.
37. Google Scholar (http://scholar.google.com) and Scirus (http://www.scirus.com) are academic search engines. In 2006 Microsoft released Live Search Academic, which was withdrawn in 2008; in 2009 the company launched another scholarly search engine very similar to its predecessor (http://academic.research.microsoft.com/).
40. The Digital Repositories Programme, which ran until 2008, and the Repositories and Preservation Programme, which is still operative, are two of the programmes launched by JISC in the area of digital repositories. SHERPA displays its services, resources and projects on the home page of its website.
41. JISC was one of the organisers of the International Repositories Workshop, held in the Netherlands in March 2009.
43. To cite only a few projects: LOREnet (http://www.lorenet.nl/nl/page/luzi/show?showcase=1) is a repository network of teaching materials, and DAREnet, a repository network created in 2004, became part of NARCIS (http://www.narcis.info/index), a Dutch scientific and academic portal, in 2008.
44. It is worth mentioning the census of institutional repositories in the United States, carried out under the umbrella of the MIRACLE project (Markey et al., 2007), the report on Open Access in the Nordic countries (Hedlund and Rabow, 2007), the study on the situation of institutional repositories in Europe (van Eijndhoven and van der Graaf, 2007), the analysis of repositories in Australia (Kennan and Kingsley, 2009) and, in Spain, the report by Melero et al. (2009).
45. The Dublin Core originated at a meeting held in Dublin, Ohio, in 1995, and was conceived as a common metadata standard for describing Internet resources. It consists of fifteen elements for describing electronic documents (title, creator, subject, description, publisher, contributor, date, type, format, identifier, language, source, relation, coverage, rights), which provide basic information about them: http://dublincore.org/ The Qualified Dublin Core is an extension of the Dublin Core in which some elements are accompanied by a qualifier that makes them more restrictive (e.g. Date.Created, Date.Available, Date.Modified): http://dublincore.org/documents/2000/07/11/dcmes-qualifiers/
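A Dublin Core description is simple enough to generate programmatically. The sketch below, which uses only Python’s standard library, wraps a few of the fifteen elements in `dc:`-namespaced XML tags; the `make_dc_record` helper and the sample record values are invented for illustration, not taken from any repository software.

```python
# Minimal sketch of serialising a Dublin Core record as XML.
# The helper function and sample values are hypothetical examples.
import xml.etree.ElementTree as ET

DC_NS = "http://purl.org/dc/elements/1.1/"
ET.register_namespace("dc", DC_NS)  # serialise elements with the dc: prefix

def make_dc_record(fields: dict) -> ET.Element:
    """Wrap each (element, value) pair in a dc:<element> tag."""
    record = ET.Element("record")
    for name, value in fields.items():
        el = ET.SubElement(record, f"{{{DC_NS}}}{name}")
        el.text = value
    return record

record = make_dc_record({
    "title": "Sample thesis",   # invented demonstration values
    "creator": "Doe, Jane",
    "date": "2009",
    "type": "Text",
})
xml_str = ET.tostring(record, encoding="unicode")
print(xml_str)
```

The result is a `<record>` element carrying the Dublin Core namespace declaration, with one `<dc:title>`, `<dc:creator>`, `<dc:date>` and `<dc:type>` child per supplied field.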
49. This method is proposed in order to minimise the effort required of authors when placing contents in a repository: if information on researchers’ curricula, publication lists and other services is already collected, the librarian can take part in the self-archiving process and the author has less work to do.
51. These are reduced reproductions of documents that require a reader or projector to be viewed.
58. To cite just a few examples: Good Practice Guide for Developers of Cultural Heritage, a website created in 2004 by UKOLN, the Arts and Humanities Data Service and the University of Bath, and updated in 2008; and A Framework of Guidance for Building Good Digital Collections, prepared by the NISO Framework Working Group with support from the Institute of Museum and Library Services, the third edition of which dates from December 2007.
59. 163 countries have signed the convention: http://www.worldcopyrightcenter.com/signatories-berne-convention.html
61. According to the Berne Convention, the minimum term is the author’s lifetime plus another 50 years.
63. To cite just two examples, OCLC’s CONTENTdm (http://www.contentdm.org/) and Greenstone, the latter developed for the New Zealand Digital Library Project at the University of Waikato (http://www.greenstone.org/), are programmes often used to create digital collections.
64. Dublin Core, RDF, EAD, TEI and MODS are just some of the models used to describe digital collections.
69. The fragility of physical media and technological obsolescence are the main threats to digital collections, so digital preservation aims to conserve both facets of the document: its physical carrier and its contents.
84. This news item appeared in the online newspaper Elmundo.es on Friday 18 December 2009: http://www.elmundo.es/elmundo/2009/12/18/navegante/1261141624.html
87. With Science 2.0 there is a change both in researchers’ attitudes and in the way information is written and disseminated. Scientific communication is enriched by the appearance of new methods of storage and publication.
88. All the 2.0 concepts share the same characteristics as those listed for Web 2.0 at the beginning of this chapter.
97. Among academic researchers, Wikipedia is not regarded as a highly reliable source.
100. The respondents are academic libraries in several regions around the world: Australia, China, Hong Kong, Singapore, New Zealand and the United States.
107. One example of this is the 1,234 members belonging to the group ‘Libraries using Facebook pages’. Figure retrieved on 16 January 2010.
118. A list of libraries that use Del.icio.us to share bookmarks. Most of them classify the links by library subject or task, depending on whether the service is used internally or is aimed at users.
126. We have addressed this point in the chapter on catalogues.