Feedback Evaluation – Stage 2

I felt the feedback process was easier this time, possibly because, having reached the end of the unit, my understanding of the assignment tasks was clearer, or possibly because I felt a little more comfortable having done it before. Unlike the first post, in which I found reading other students' work somewhat confusing, this time there was greater clarity. It certainly helped that the other students in my group were also undertaking Information Learning Activities (ILAs) in a university setting. I certainly wouldn't feel equipped to evaluate ILAs in a school setting, as I don't have the professional experience or knowledge.

In providing feedback for other students I was able to point out that one of them hadn't related their discussion to Bloom's Revised Taxonomy, which was mentioned in the criteria sheet. My teammate then added this to her post, which pleased me as it made the exercise a positive experience. I also mentioned a point in another teammate's methodology that I thought needed clarifying, which I hoped they found useful. Reading this student's blog introduced me to an information literacy model, Bloom's Digital Taxonomy, that I had not thought to use in my own discussion but that fitted perfectly, so this added value to my work.

Likewise, the feedback I received was very positive, which was heartening, with comments such as "I liked your reference to Jamie McKenzie's The Research Cycle, as this was not one of the models I used. It fits well with your cohort of learners" and "I agree with all the recommendations made, especially one and two".

The positive feedback was great, but another comment, made in relation to an observation about the delivery and content of my ILA, was extremely helpful: "It would be interesting to hear your view on how you would change this to maximise the effectiveness of the learning activity". Because of this I edited my discussion blog to clarify my point. In all honesty, I would have appreciated more concrete criticism, as I was looking for points and assistance that would enhance my submission, and it was the critical evaluation that proved most valuable. Of course, I have to acknowledge that I was at the same time reluctant to be overly critical of others' work myself. Overall, though, it was a beneficial exercise, and it obliged me to contact other students, which helped negate the feeling of isolation that online study can foster.


Discussion and Recommendations – Stage 2

As higher degree research students, these students present with varying degrees of information literacy skills. Factors such as previous information literacy education, background and age mean that not all doctoral students necessarily have the research skills essential to completing their dissertations. These students need to be well prepared to adjust and "keep up with the rapidly changing library services and resources in the 21st century" (Tunon & Ramirez, 2010, p. 993).

When evaluating their responses against the Australian and New Zealand Information Literacy Standards (Bundy, 2004, p. 11), the students all considered their ability to ascertain their information needs well developed, but identified critically evaluating information, finding relevant information, analysing and gaining new knowledge, and managing sources as the key areas of concern. These capabilities correspond to the Higher Order Thinking Skills (HOTS) in Bloom's Revised Taxonomy. Of the three psychological domains proposed by Bloom, these skills belong to the cognitive domain, comprising Knowledge, Comprehension, Application, Analysis, Synthesis and Evaluation (Churches, 2009, p. 3). These students have already decided on their topic and have chosen an appropriate area of research. In Kuhlthau's Information Search Process (ISP) model they can be situated in the collection stage, having already selected and identified their focus, and are now searching for information to support their research (2007, p. 18). The affective dimension of the ISP model places the students at a confident and focussed stage, and this was confirmed by my observation of them throughout the workshops. Whilst Kuhlthau's ISP model does not discuss the management of data, other models do. Jamie McKenzie's Research Cycle discusses the need for strategies for managing information. Identifying the important and relevant sources in the 'sorting and sifting' stage is easier if students have good management strategies, as these enhance the ability to sort "'signals' (information that illuminates) over 'noise' (information that befuddles)" (McKenzie, 2000, planning para.).

In these workshops effective search techniques and strategies were demonstrated before the students practised them. As these students had different levels of information literacy skill, some required more assistance than others. The citation searching exercise required the most guidance and help from the facilitator. Many of the inquiry learning models discuss the need for intervention or 'scaffolding'. Churches (2009, p. 9), discussing Bloom's Revised Taxonomy, notes that scaffolding is essential to provide the foundations of learning and support the constructivist process.

In Lupton and Bruce’s’ GeST model (2010, p.4) the students’ abilities can be associated with all three perspectives. They all exhibit the behavioural or generic skills necessary to perform as higher research students such as using information technology to locate relevant information. In the situated literacies these students participate in a variety of scenarios. When describing her topic one student explained that she “produced toolkits, designed and implemented workshops and participated in conferences” thus demonstrating a variety of situated literacies in her professional life. The transformative perspective is represented in their academic studies. Another student’s topic dealt with the psychological issues pertaining to paramedics. The situated and  transformative view of information literacy “empowers students to be active designers – makers of social futures” (Cope & Kalantzis as cited by Lupton & Bruce, 2010, p.22). As doctoral candidates these students will be writing a thesis that will evaluate current research and formulate new discussion in their chosen discipline.

In today’s society digital literacy is essential for any higher degree student. These students digital capabilities were for the most part well developed as you would expect of students at this stage of their academic studies. According to the Digital Information Fluency model (2011, para. 1) “digital information fluency is the ability to find, evaluate and use digital information effectively, efficiently and ethically”.  Models such Callison’s ‘Information Inquiry’ (2006, p.15) and Eisenberg’s Information Literacy Model, the Big6 (2012, Evaluation para.) acknowledge the significance of the integration of information technology skills and information literacy skills. With internet searching the ability to critically evaluate sources has become paramount as standard editorial evaluation of sources no longer applies.  As Grafstein  (2007, p.60) points out,  students require higher order thinking skills to evaluate information. It is noteworthy that the students who participated in this workshop also identified this as an issue. As pointed out by Lupton in the Australian and New Zealand Information Literacy Framework students need to be able to use and adapt to new information communication technologies but it is the ability to critically evaluate information that defines legitimate information literacy skills.  “Information literacy initiates, sustains, and extends lifelong learning through abilities that may use technologies but are ultimately independent of them” (2004, p.4). In one of the models delivered in this program networking sites are discussed and demonstrated to the students. I did not include a question about this in my surveys as it is not a hands-on activity. However I observed that all of the students took notes and questioned the facilitator extensively. In this globally digital world the ability to network with other researchers in a specific discipline, sharing knowledge and discussion is of great value. 
Bloom’s Digital Taxonomy (2009, p.10) situates this as a Lower Order Thinking Skill, the ability to retrieve or remember important skills to “maintain all of the current relevant knowledge for their learning”.

Assessment of this learning activity adds value, as it reinforces the learning content for the students and enables the facilitator to gauge the success of the program. According to Harada and Yoshina (as cited in Kuhlthau, Maniotes & Caspari, 2007, p. 111), it is critical because it serves as a tool to help students identify their strengths and weaknesses. Assessment acts as a form of reflection, giving students the opportunity to reflect on and evaluate what they have learned. Many information literacy and inquiry-based models, such as the Alberta Inquiry Model (2004, p. 10), discuss the importance of evaluation in enabling students to transfer knowledge to new situations.


  • I would recommend that the workshops be redesigned to be more discipline specific, which would enable the facilitator to tailor the content more appropriately. At times I had to help the students select more appropriate databases and then assist them with the functionality of the search page. For some disciplines a multidisciplinary database may provide results, but the students will be more successful if they are directed to subject-specific databases.
  • More staff to assist in these workshops would be beneficial. The disparity in the skills that students present with can make it difficult for the facilitator to provide the necessary assistance; my presence throughout these workshops helped greatly. The facilitator who ran the program commented that it can be difficult to stay on track and deliver all of the content because some students require extensive assistance. This could be caused by language barriers in the case of international students, or by a lack of technological capability among mature-age students.
  • I would recommend that all students who enrol in the program be contacted by their liaison librarian for an overview of the library webpage, catalogue, databases and services. Familiarising students with the catalogue and databases prior to the commencement of the program would deliver better outcomes.
  • I would recommend that the period for submission of the assessment be reduced from six months to one month after completion of the course. The facilitator commented that many students are very slow in submitting their assessments; completing the assessment closer to the program will result in better retention for the students.

The content and delivery of this program are modelled on traditional library instruction classes: a combination of lectures, demonstrations and time to practise. In the truest sense this is a modified guided inquiry activity. Most of these students demonstrated well-developed information literacy skills, as would be expected at this stage of their studies. Bearing in mind the amount of content that needs to be delivered and the cognitive skills of these students, the delivery and content are appropriate, and this is supported by the results and data analysis of this information learning activity.


Alberta Inquiry Model (2004). Focus on inquiry: A teacher's guide to implementing inquiry-based learning. Retrieved from

Bundy, A. (Ed.). (2004). Australian and New Zealand Information Literacy Standards. Retrieved from Queensland University of Technology Course Materials Database.

Callison, D. (2006). The blue book on information age inquiry, instruction and literacy. Retrieved from Queensland University of Technology Course Materials Database.

Churches, A. (2009).  Bloom’s Digital Taxonomy. Retrieved from

Eisenberg, M. (2012). Big6 skills overview. Retrieved from skills-overview.php

Grafstein, A. (2007). Information literacy and technology: An examination of some issues. Portal: Libraries and the Academy, 7(1), 51-64. Retrieved from

Kuhlthau, C. C., Maniotes, L. K. & Caspari, A. K. (2007). Guided inquiry: Learning in the 21st century. Westport, CT: Libraries Unlimited.

Lupton, M. & Bruce, C. (2010). Windows on information literacy worlds: Generic, situated and transformative perspectives. In A. Lloyd & S. Talja (Eds.), Practising information literacy: Bringing theories and information literacy together (pp. 3-27). Retrieved from Queensland University of Technology Course Materials Database.

McKenzie, J. (2000). The Research Cycle. Retrieved from

Tunon, J. & Ramirez, L. (2010). ABD or EdD? A model of library training for distance doctoral students. Journal of Library Administration, 50(7-8), 989-996. doi:10.1080/01930826.2010.489004

21st Century Information Fluency (2011). Digital Information Fluency model. Retrieved from

Reflection on Feedback – Blog Stage 1

I actually found this to be one of the more difficult parts of the assessment so far. Well, not the receiving part of the feedback, but rather the giving. I felt uncomfortable in this role, probably because of my age and my previous educational experiences, which were based on more traditional pedagogy. Or perhaps it is because I am very shy. In any event, there were definite 'barriers' to this part.

When I read other students' work it made me question my own understanding of the topic and task. Rather than clarifying the task for me, it created more anxiety. However, as I reflect upon this now, I can see that the more I questioned myself about the topic, the better my understanding would (or should) be. I noticed in my peers' posts some tasks that I had forgotten to do, namely the Google search, and this reminded me to do them. So the evaluation process acted as a checklist for me.

In the evaluation of my work one of my peers suggested that I add some more screenshots of a certain search to clarify it, which was good advice. The other feedback I received, about the quality of the sources I had found, was very positive. Perhaps I will become more comfortable with this part of the assessment the next time round.

Questionnaire 3 – Stage 2


  1. Take some time to think about your topic. Now write down what you know about it.

I have gained a lot more knowledge about inquiry based learning in general. Specifically in conducting my ILA I have found quite extensive literature on its role in academic libraries and the advantages of inquiry based or problem based information literacy classes for university students.

2. How interested are you in this topic? Check (✓) one box that best matches your interest.

Not at all ☐    Not much ☐    Quite a bit ☐    A great deal ☐

My interest level has certainly increased with the reading of the literature and conducting my own evaluation of a program that we provide here.


3. How much do you know about this topic? Check (✓) one box that best matches how much you know.

Nothing ☐    Not much ☐    Quite a bit ☐    A great deal ☐

4. Thinking back on your research project, what did you find easiest to do? Please mention as many things as you like.

  • Evaluating sources
  • Finding relevant information
  • Constructing new meaning from old and new information

5. Thinking back on your research project, what did you find most difficult to do? Please mention as many things as you like.

Managing sources – I am sorry to say that I am still as disorganised as ever. I have numerous folders of journal articles, information literacy standards and so on, on my desktop and a USB, with stupid names like "good research" and "very relevant", which of course means that I have to "sort and sift" through them constantly. I even have print copies littered all around my desk (with post-it notes saying "do not touch – important" written on them). I certainly did not use Evernote to its full potential, mainly using it as a communication tool for my blog posts.

Determining the extent of the information required is still a problem – ever the librarian, I love the search process, and once I am on an information-seeking mission I hate to stop.

Getting on with it – I need to spend less time thinking about it and more time doing it. Could someone please invent a procrastination pill? I'm sure there is a huge market for it.


6. What did you learn in doing this research project?

I struggled with some of the technology involved, such as inserting screenshots into blog posts; fortunately, working where I do, I was able to pick the brains of co-workers. Likewise with Excel (which I had never used before – I know this is hard to believe): I spent days fiddling, trying to create graphs, becoming more and more frustrated. I learnt how to ask for help, but I wish I had done it a lot sooner, which would have saved me a lot of angst.

On a personal note I discovered that I am not the only person in this house who can a) iron a shirt and b) cook a meal – yay!


7. How do you now feel about your research? Check (✓) one box that best matches how you feel.

Unhappy – I don't feel confident with how it turned out ☐

Confused – I don't really know what I was looking for ☐

Moderately confident – I think it turned out OK ☒

Happy – I'm really happy with how it turned out ☐

Note that I only said "moderately confident". I guess we will have to wait for the final grade, which is not really going to be a true indication. It is an unfortunate fact that we have to have a grading system to measure students, but that is the way it is. Regardless, I know I have learnt a great deal about the topic, my own research skills, and my strengths and weaknesses as a student.

Results Analysis – Stage 2

As the four different workshops run continuously throughout the year, it wasn't certain that I would encounter the same students each time. Likewise, not everyone completed the surveys. However, I ended up with ten students who completed the questionnaires and whom I was able to observe throughout the four-week period. The responses to the questionnaires were often minimal; this was particularly noticeable by the third. This may be because 60% of the students were international and, as English is not their first language, they may have needed more time to formulate their responses. However, I think it is more likely that this was a limitation of the method I used: the students were not particularly interested in my project or motivated to assist. A focus group with interviews would have been a more successful method. It is interesting to note that two of the students gave almost identical responses in all questionnaires. For example, Student A consistently rated his skill levels at the highest point throughout, while Student C almost consistently rated her skill levels quite low and indicated no improvement. However, my observations of both of them indicated that they appeared to be very competent.

Statement Types

As can be seen from the above graph, there were far more evaluating statements than facts or conclusions. In some cases the students' responses were almost identical across all three questionnaires, from which we can conclude that the students weren't aware that I was evaluating this response. One student just used bullet points listing relevant work experience, such as the design and implementation of workshops, toolkits and case studies; so, despite there being little or no evaluating or concluding sentences in her responses, it was still evident that she had quite extensive knowledge. Another student answered in detail, with facts, evaluation and conclusions, using phrases such as 'explores sustainability', 'specifically focussing of practices of re-use and recycling' and 'aiming to develop a model'.

I would have expected far more complex answers from these students, which would have resulted in more conclusions; higher degree research students would have a fairly comprehensive knowledge of their topic. In assisting with these workshops I was able to engage with the students and discuss their research areas, and it was quite apparent from conversations with them that they did have substantial knowledge of their topics. I believe the above results reflect a limitation of the questionnaire method, in that I should have created more room for their responses. As well, I didn't explain to the students what sort of response I was expecting here; more direction would have resulted in more detailed responses.

Knowledge of Topic

In the above question the students were asked to identify how much they knew about their topic. It is interesting that 60% of the respondents stated that they knew "not much" about their topic before they started. All but one student felt that their knowledge increased as they progressed; in the case of Student I it increased from "not much" to "a great deal". Given that these surveys were administered over a four-week period, you would assume that these researchers were continuing to research outside the workshops and acquiring more knowledge about their topics, which would also account for the increases.

Research Skills the Students Found Easy to Do

In Questions 3 and 4 I asked the students to identify what they did well or not well when researching, and compared this to the Australian and New Zealand Information Literacy Standards. Many of the comments centred on the locating of resources. The above graph reflects a decline in this in Q.2 and Q.3; however, I believe this is because the students' responses were overall shorter in Q.3. Many of the students commented on completion of the course that they had learnt a great deal about how to find information more effectively, which contradicts the responses; it may be that some of the students still lacked confidence in their abilities. The ability to critically evaluate information improved significantly by Q.3, with students stating they were able to find good sources to support their research. One student stated that he was able to "locate, comprehend and apply new research to support the arguments". Applying prior and new information to create new understanding also improved by Q.3, with students acknowledging that they were able to connect new ideas and identify different directions to take with their research. Many of the responses were very short, with only one or two sentences. I believe I should have given the students more direction with this question and the next, which may have resulted in more detailed responses.

Research Skills the Students Found Difficult to Do

Similarly, when asked what they found difficult to do, most of the students' responses centred on the locating of sources, critical analysis, managing their information and the creation of new understanding. By Q.3 most students identified searching for information effectively as no longer a major issue, which demonstrates the effectiveness of the learning activity. Several students commented that keeping track of and organising their research was an issue. Other comments, such as "it is difficult to extract related concepts and apply them", indicate that creating new understanding was also a problem. It would be hoped that by Q.3 there would be fewer statements of research difficulty, and with the exception of the evaluation of sources column this is supported by the above graph. This is at odds with the data recorded for the specific evaluation question (to which I will refer later), in which all students recorded an improvement in their source evaluation skills by Q.3. It is also interesting to note that the students' perceptions of what they did and did not do well centred heavily on critically evaluating sources.

Database Searching Skills

The second questionnaire was delivered after the database session, and as expected there was an increase in all but one student's perceived skill level. Two of the students reported a significant improvement, with their responses changing from 'not at all' to 'quite a bit'. In the case of Student C, my observation was that she was managing the searches easily, although without interviewing her it is difficult to ascertain whether she was retrieving appropriate results. In this database workshop the students are shown a multidisciplinary database and given time to practise searching. However, some of the students needed assistance as they were not attaining results; this was because the chosen database was not appropriate for their discipline. Intervention by the facilitator and myself, directing the students to more appropriate databases, yielded success.


Knowledge of Database Search Features and Techniques

In the above question the students were asked what they knew about database search features and techniques; 80% responded 'not at all' or 'not much'. It is positive to note that there was an increase in their knowledge after the session. Student J's response was very positive; this student required a lot of assistance, as she was an older student returning to study after a lengthy break. In the workshop the students were taught about Boolean operators, and many requested assistance and clarification when performing their own searches. My observation by the end of the session was that most of them were using Boolean operators effectively.

Citation Searching

As would be expected, the majority of students knew very little about citation searching. By Q.3 all of their skill levels had increased significantly; in fact, this question saw the largest improvement between the questionnaires, and most students were very excited by this search feature. As Student A commented, "the ability to track the research both forwards and backwards and identify key authors is really useful". This section required the most assistance from myself and the facilitator; nearly all of the students asked for clarification, as this was a new concept for most of them.

Internet Searching

It is not surprising that the above results reflect the majority of students' perception that their internet searching skills were good. The module on internet searching is conducted after Questionnaire 2 is delivered, and this is reflected in the students' skill levels remaining the same between Q.1 and Q.2. It is worth noting that 80% of the students felt their internet searching skills improved after this workshop. With the exception of Students H and J, the cohort were all Generation Y and therefore very familiar with Google and similar tools; with the exception of Student J, they were all familiar with Google Scholar. In the course of the workshop it became apparent that the students knew very little about different search engines or search refining techniques such as domain searching or advanced search functions. As one student commented to me, "I didn't realise how much more there was to know about internet searching".


Evaluation of Sources

Evaluation of sources is discussed in the first workshop, and the above graph reflects that students' knowledge and understanding had increased before completing Q.2, with no further improvement after that. This is the only question on which Student C recorded any change in response between the questionnaires.


At the time of this presentation only two students had submitted their assessment. Student A's assessment was extremely thorough and scored very highly. This was consistent with his responses in the questionnaires: he consistently rated his abilities in the highest category, and throughout the workshops he never sought assistance or clarification, with the exception of his comment on citation searching.

Student J’s assessment was good with her obtaining a pass in all sections.  The assessments are graded and returned to the student with feedback and the option to resubmit. Student J did not want to resubmit however she did reply commenting very favourably on the program writing “I was completely floundering before I did this……thank you I have learnt so much”.

It was unfortunate that there were no other assessments to review, as it would have been useful to evaluate them and compare the results with my observations and the students' questionnaire responses.

Overall Analysis

Many students, when handing me their final questionnaire at the completion of the program, commented on how useful it had been. The data analysis of the questionnaires confirms that there was a definite improvement in the students' perception of their searching abilities, and this was supported by my own observations throughout the workshops. My main concern was that the delivery of these workshops, a combination of lectures, demonstrations and hands-on practice, would not be entirely effective. However, the data, my observations, the assessment evaluation and the comments from the students all indicate that this was a successful information learning activity.

Methodology – Stage 2


The Advanced Information Retrieval Skills program is mandatory for all higher degree research students. The course consists of twelve hours of instruction and hands-on workshops conducted by a university library.

A questionnaire was prepared and presented to the students at three stages: at the commencement of the program, at the end of the second workshop and on completion of the final workshop. The questionnaires were based on the Student Learning through Inquiry Measure (SLIM) questionnaire developed by Todd, Kuhlthau and Heinström, with some modifications.

  1. Take a couple of minutes to reflect on your topic and write down what you know about it.
  2. How much do you know about this topic?
  3. When you do research what do you generally find easy to do?
  4. When you do research, what do you generally find difficult to do?
  5. How competent are you with database searching?
  6. How much do you know about using database search features to refine your search, such as phrase searching, truncation, search fields etc.?
  7. How much do you know about Citation searching?
  8. How familiar are you with Internet searching?
  9. How knowledgeable are you about evaluation criteria of sources?

Questions 5 through 9 were specific to this learning activity, whose aim is to develop advanced research skills.

In Questions 1, 3 and 4 the students were provided with room to give detailed answers, whilst in the other questions they were asked to select from four scaled responses.

Not at all                     Not much                    Quite a bit                   A great deal

The answers were given scores ranging from 1 for 'Not at all' to 4 for 'A great deal'. The results were then coded according to the method provided by the SLIM toolkit. Anecdotal data was also obtained by observing the students and evaluating their assessments.
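The scoring step described above can be sketched in a few lines of code. The following Python is a minimal illustration only: the mapping of labels to scores (1–4) follows the questionnaire, but the student responses shown are invented for the example and are not the actual survey data.

```python
# Map the four scaled response labels to numeric scores, as in the SLIM approach.
SCALE = {"Not at all": 1, "Not much": 2, "Quite a bit": 3, "A great deal": 4}

def score_responses(responses):
    """Convert a list of scaled response labels to numeric scores (1-4)."""
    return [SCALE[r] for r in responses]

def mean_score(responses):
    """Average score for one question across all students."""
    scores = score_responses(responses)
    return sum(scores) / len(scores)

# Hypothetical responses to Question 5 ("How competent are you with database
# searching?") from ten students, before and after the database workshop:
q5_before = ["Not much", "Not at all", "Quite a bit", "Not much", "Not much",
             "Quite a bit", "Not at all", "Not much", "Not much", "A great deal"]
q5_after  = ["Quite a bit", "Not much", "Quite a bit", "Quite a bit", "Quite a bit",
             "A great deal", "Quite a bit", "Quite a bit", "Quite a bit", "A great deal"]

print(mean_score(q5_before))  # 2.2
print(mean_score(q5_after))   # 3.1
```

Comparing the two means question by question is essentially what the graphs in the Results Analysis present.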

Questionnaire 2

  1. Take some time to think about your topic. Now write down what you know about it.

I’ve begun to have a good overview of IBL and its related topics such as PBL and how they are being used in a university setting. I am familiar with some of the IBL models although I feel my understanding is still quite superficial.

2. How interested are you in this topic? Check (✓) one box that best matches your interest.

Not at all ☐    Not much ☐    Quite a bit ☐    A great deal ☒


3. How much do you know about this topic? Check (✓) one box that best matches how much you know.

Nothing ☐    Not much ☐    Quite a bit ☒    A great deal ☐

4. Thinking of your research so far – what did you find easy to do? Please mention as many things as you like.

Database/search engine searching is relatively easy for me as I do it all the time professionally.

5.  Thinking of your research so far – what did you find difficult to do? Please mention as many things as you like.

I still find it difficult to stay on task. My searching brings up a lot of related articles that are not always specific to my topic; however, it is all interesting reading, so I can quite often lose a couple of hours without being really productive.

6. How do you feel about your research so far? Check (✓) one box that best matches how you feel.

Frustrated – I can't find what I want ☐

Overwhelmed – I'm finding it hard to sort through the information ☐

Confused – I don't really know what I'm looking for ☐

Confident – I think I know where I'm heading ☒