
Research Readiness Self-Assessment (RRSA): Publications, grants and presentations

Chan, C. (2016, in press). Institutional Assessment of Student Information Literacy Ability: A Case Study. Communications in Information Literacy, 10(1).

Wong, S. C. (01/04/2015 - 31/03/2018). Enhancing information literacy in Hong Kong higher education through the development and implementation of shared interactive multimedia courseware (HK$6,197,000). Information literacy project proposal submitted by the eight publicly funded universities in Hong Kong and funded by the University Grants Committee, Hong Kong. Participating institutions: The Chinese University of Hong Kong, City University of Hong Kong, Hong Kong Baptist University, The Hong Kong Institute of Education, The Hong Kong Polytechnic University, The Hong Kong University of Science and Technology, Lingnan University, and The University of Hong Kong.

Swartz, B., Ratcliff, A., & Ivanitskaya, L. (2015, in press). Correlation of attitudes and beliefs with actual abilities of speech language pathology students regarding aspects of information literacy. Contemporary Issues in Communication Science and Disorders.

Leung, P., & Chan, C. (2015). Institutional Level Assessment of Student Information Literacy Competencies Using the RRSA. A presentation at the Joint University Libraries Advisory Committee Forum at The Hong Kong Institute of Education. Abstract: Since 2011, both Lingnan University Library and Hong Kong Baptist University Library have adopted the Research Readiness Self-Assessment (RRSA) to benchmark the information literacy (IL) competency levels of first-year students. Using the latest results for the 2014-15 cohort, we will compare our students’ self-belief in their research ability and their actual research skills. We will also highlight their specific IL strengths and weaknesses by analyzing their answers to individual questions of the assessment. In addition, HKBU Library will share the IL competencies results from a representative sample of senior students. This presentation will share our experience in the assessment of IL and identify key aspects for enhancing students’ IL skills. URL: http://www.julac.org/?page_id=4358

Jackson, C. (2013). Confidence as an indicator of research students’ abilities in information literacy: A mismatch. Journal of Information Literacy, 7(2), pp. 149-152. Results & Conclusions: An analysis of the anonymised results of the 130 completed tests showed that there was a mismatch between students’ perceived IL and the scores achieved in the objective questions asked by the RRSA test. The RIN report on supervisors’ roles in developing students’ IL encourages supervisors to support and discuss their students’ skills (2009). The importance of this support from supervisors, and finding ways to objectively assess students’ skills, is highlighted by the findings of our diagnostic skills assessment, suggesting that students lack the awareness to assess their skills level without guidance. These findings also raise questions over IL tests that use confidence as a predictor of IL ability, suggesting such methodology needs reconsideration.
URL: http://dx.doi.org/10.11645/7.2.1848

Ratcliff, A., Swartz, B., & Ivanitskaya, L. (2013). Information literacy skills of students in a communication disorders training program. Contemporary Issues in Communication Science and Disorders, 40, 31-39. Abstract: Purpose: An increasingly important aspect of university programs in communication sciences and disorders is making students aware of evidence-based practice (EBP). One of the components of EBP is information literacy, or the ability to conduct effective information searches and examine quality indicators of the information found. The information literacy skills of communication sciences and disorders students at 3 academic levels in 1 training program were measured to determine the nature and extent of any differences in the skills across academic levels. Method: The communication sciences and disorders version of the Research Readiness Self-Assessment (RRSA; Ivanitskaya, O’Boyle, & Casey, 2006) was administered to 150 students enrolled in sophomore-, senior-, and graduate-level courses in a communication sciences and disorders program. This web-based assessment measures students’ skills in finding and evaluating information from a variety of sources related to speech-language pathology. The results were analyzed to determine how information literacy differed across the 3 academic levels. Results: There were significant differences among student groups in the objectively measured skills related to searching for and evaluating information. For the most part, graduate-level students demonstrated the highest skills, followed by senior- and sophomore-level students. Conclusion: Implications for communication sciences and disorders training programs are discussed, as are ways to facilitate students’ information literacy skills. URL: http://www.asha.org/uploadedFiles/ASHA/Publications/cicsd/2013S-Information-Literacy-Skills.pdf

Ivanitskaya, L. V., Hanisko, K. A., Janson, S. J., Garrison, J. A., & Vibbert, D. (2012). Developing health information literacy: A needs analysis from the perspective of pre-professional health students. Journal of the Medical Library Association, 100(4), 277-283. doi: 10.3163/1536-5050.100.4.009. Abstract: OBJECTIVE: To identify the skills, if any, that health pre-professional students wish to develop after receiving feedback on skill gaps and the strategies they intend to use to develop these skills. METHODS: A qualitative approach was used to elicit students’ reflections on building health information literacy skills. First, the students took the Research Readiness Self-Assessment instrument, which measured their health information literacy, and then received individually tailored feedback about their scores and skill gaps. Second, students completed a post-assessment survey asking how they intended to close the identified gaps in their skills. Three trained coders analyzed qualitative comments by 181 students and grouped them into themes relating to “what skills to improve” and “how to improve them”. RESULTS: Students intended to develop library skills (64% of respondents), Internet skills (63%) and information evaluation skills (63%). Most students reported that they would use library staff members’ assistance (55%) but even more respondents (82%) planned to learn the skills by practicing on their own. Getting help from librarians was a much more popular learning strategy than getting assistance from peers (20%) or professors (17%). CONCLUSIONS: The study highlighted the importance of providing health pre-professional students with resources to improve skills on their own, remote access to library staff members, and instruction on the complexity of building health literacy skills, while also building relationships among students, librarians and faculty. URL: http://www.ncbi.nlm.nih.gov/pubmed/23133327

Long, J., & Gannaway, P. (2012). Improving Research skills of Registered Nurses: Effects of a Web-based tool. A peer-reviewed poster presented at the 2nd Annual Improvement Science Summit: Advancing Healthcare Improvement Research.

Ivanitskaya, L., Albee, P., & Janson, S. J. (2012). An Adaptive eHealth Information Literacy Assessment for Pre-Professional Health Students. A poster presented at Medicine 2.0, The 5th World Congress on Social Media, Mobile Apps, and Internet/Web 2.0 in Health, Medicine and Biomedical Research at Harvard Medical School, Boston, MA. Abstract: Background: Research Readiness Self-Assessment is an online interactive tool that tests eHealth information literacy competencies. In addition to assessing skills and knowledge, it captures other related competencies, such as self-reported beliefs about health media and its limitations. After an assessment taker responds to all questions and completes interactive exercises, she receives individually tailored, automated feedback about her eHealth information competencies. The participant also receives suggestions on how to further build these competencies. Objective: To convert an existing tool, which is a hybrid of a test and a survey, into an adaptive test. The original instrument takes between 30 and 45 minutes to complete. Its participants answer the same questions in the same order. In contrast, an adaptive test will contain fewer items and will take less time to complete. It will adjust to the participants’ ability level and will not give questions that are too hard or too easy for this person. In most adaptive tests, the first item is picked at a medium-difficulty level for the test population. A correct response to this item is followed by a more difficult item; an incorrect response is followed by an easier item. If the difficulty of the first item is estimated with greater accuracy, fewer test items will be needed to obtain a satisfactory level of precision in measuring one’s eHealth information skills. The intent is to use self-reported measures that are predictive of skills to improve our estimate of the difficulty of the first test item.
Methods: The first step is to validate the original instrument and the underlying eHealth information competency model. Content validity is established through a literature review and subject matter expert (SME) evaluations. SMEs are 25 medical librarians and health faculty who complete the original assessment and provide detailed feedback about its items and the overall design. Construct validity evidence is gathered by comparing scores by individuals at different levels of research experience. Reliability estimates are obtained for items included in the following scales: objectively measured eHealth information skills; self-reported beliefs about health media; research and library experience; and computer skills. Inter-scale correlations are obtained. A regression model is used to predict objectively measured skills based upon self-reported scales. The Rasch model is used to estimate difficulties of all test items. Results and Conclusions: Our preliminary findings offer evidence in support of content and construct validity of the original assessment; scale reliabilities range between .79 and .87. Experience, education level, computer skills and especially beliefs about health media predict eHealth information literacy scores. These scores are measured objectively through a series of exercises that simulate health information searches and document evaluation. Therefore, there is an opportunity to design an adaptive test that can be completed within a short time because it selects the first test item based on what is already known about the assessment taker. This knowledge is derived from the assessment taker’s responses to self-reported measures that predict eHealth information literacy scores. URL: http://www.medicine20congress.com/ocs/index.php/med/med2012/paper/view/798?trendmd-shared=0

Long, J., & Gannaway, P. (2011). Development of a Multi-Disciplinary, Evidence-Based Tool to Enhance Student Research Skills and Critical Thinking Decision Points Using the World Wide Web and a Mobile Phone App. A peer-reviewed poster presented at Sigma Theta Tau International (STTI). Abstract: Purpose: Technological advances have changed how students and faculty access information. Research suggests both groups may overestimate their research and critical thinking skills in acquiring and judging the trustworthiness of the scientific literature. The purpose of this project is to facilitate evidence-based (EB) research skills by creating a tool to enhance critical thinking decision points. The EB tool will be accessed through the WWW and a mobile phone App. Methods: This pilot project uses a descriptive, mixed-method research design. The research questions for this study are 1) Does the use of an EB tool increase the effectiveness of student research skills? 2) Does the use of an EB tool increase student critical thinking as expressed in writing? A literature review of research methods across disciplines was conducted to identify common processes encompassing EB definitions, processes, and methods. An EB tool was constructed which uses links to existing web resources and prompts the user to systematically evaluate critical decision points while researching and evaluating information sources for the purpose of writing a quality research paper. The tool will be placed on the university library website and a mobile App link created. A group of students and faculty will take the online Research Readiness Self-Assessment before and after using the tool, and faculty and student focus groups will be conducted to determine usability and effectiveness of the EB tool. Results: Data will be collected from a pilot group of students and faculty using the EB tool in the spring of 2011. Conclusion: Nursing is advancing the EB practice movement in academia.
The use of an EB tool to enhance critical thinking decision points may assist students and faculty with research skills and critical thinking as expressed in writing. The authors acknowledge the support for this study provided by an Institutional Small Grant through EquipLCU. URL: http://www.nursinglibrary.org/vhl/handle/10755/201898

Belcik, K. D. (2011). Information literacy of Registered Nurses at magnet hospitals. Doctoral Dissertation, The University of Texas at Austin. Abstract: Introduction: More patients are turning to the Internet as a source of health information. Nurses occupy the frontline of healthcare and must have information literacy (IL) competencies to guide themselves and their patients to the correct and appropriate health information on the Internet. Within magnet hospitals, which are exemplars for excellent nursing practice, there is an increased emphasis on EBP and research, which requires IL. Exploring IL at magnet hospitals was reasonable considering such competence is promoted. Nurses lack IL competencies which are necessary to inform their patients and impact healthcare. The purpose of this research study was to objectively measure the information literacy competencies of registered nurses, specifically their competencies in accessing and evaluating electronic health information, self-perception of information literacy, reliance on the Internet, and the relationship among these competencies. Method: A convenience sample of 120 registered nurses, at four magnet hospitals, completed the Research Readiness Self-Assessment—Nurse (RRSA-Nurse), an interactive online instrument and a demographic data form. Data were analyzed using descriptive, correlational, and regression statistical methods. Results: Nurses employed at magnet hospitals had a high ability to access and evaluate health information and high overall IL. Their self-perception in their abilities to access and evaluate health information was high and a majority did not rely on the Internet solely for health information. Seven variables were significantly correlated to overall information literacy including role, PhD/MSN nursing education, ability to access health information, ability to evaluate health information, library and research experience, contact with library staff, and library use.
Lack of exclusive reliance on the Internet for health information and PhD/MSN nursing education were major predictors of information literacy. Discussion & Conclusions: Further research is necessary to explore qualities within magnet hospitals that contribute to the promotion of information literacy competencies in nurses. Understanding these qualities may assist with the development of interventions to increase information literacy among practicing nurses.

Ivanitskaya, L., Billington, A., Hanisko, K., Janson, S., & Erofeev, D. (April, 2011). Information literacy of health students: Assessment and interventions. Two research studies presented in a 45-minute session at LILAC - the Librarians’ Information Literacy Annual Conference, British Library & London School of Economics Library, London, England. Abstract: Due to the proliferation of health information available on the internet and in electronic databases, future healthcare professionals need to assess information needs; navigate different sources and channels; identify information gateways and access barriers; discern among different types of health information; use electronic health tools; and assess relevance, accuracy and completeness of the information they find. These tasks require e-health literacy. Do students and experts have different levels of e-health literacy? Do skills vary by academic discipline? Is e-health literacy little more than computer skills? How can educators increase the quality of bibliographical references cited in student papers? Students at a Midwestern US university (n=765) from a variety of specialization backgrounds and health information experts (n=19) completed an online interactive Research Readiness Self-Assessment (RRSA), health version. The experts were health science librarians and health faculty from 12 academic institutions in the US. RRSA contained 40 items that measured skills related to finding and evaluating health information from electronic sources, such as the library databases and the open access internet. Upon answering all questions online, students received individually tailored feedback on their perceived vs. actual skills. Some students (n=354) could take RRSA and participate in a 30-minute library instruction session about using library databases.
The impact of both interventions was evaluated against the number of bibliographical references cited in student papers submitted at the end of the semester, weighted by source quality (the dependent variable or DV). There were statistically significant differences in skills when students of different majors were compared to experts. The differences remained after controlling for years of education. Nutrition majors had significantly higher scores than other majors. Students who did both library instruction and RRSA cited significantly better sources than students who did one or none of the activities. Library instruction accounted for 18% of variance in the DV and RRSA feedback accounted for an additional 6% of variance. Computer skills were measured using a scale adapted from Hargittai (2005); they did not explain a significant amount of variance in the DV. The relationship between computer skills and RRSA scores was weak to moderate, ranging from r=.17 to r=.34, p<.05 across all fields of study. About 70% of students thought that their skills were good but learned from RRSA that they could be improved. Computer skills are useful for building health information literacy skills; however, they do not fully explain an individual’s ability to find and evaluate health information from electronic sources. Educators of pre-professional health students should partner with health science librarians to develop health information skills of their students. Because RRSA offers evidence of weakest skill areas, it can be used as a needs assessment tool. Because it provides individually tailored feedback that contrasts expected and actual skill levels, it is also an intervention. Students who were exposed to two interventions—a library instruction session and feedback on information skills—cited more and higher-quality sources in their academic assignments than students exposed to only one or none of the interventions.

Ivanitskaya L, Brookins-Fisher J, O'Boyle I, Vibbert D, Erofeev D, Fulton L. Dirt Cheap and Without Prescription: How Susceptible are Young US Consumers to Purchasing Drugs From Rogue Internet Pharmacies? J Med Internet Res 2010;12(2):e11, URL: http://www.jmir.org/2010/2/e11/ doi: 10.2196/jmir.1520; PMID: 20439253. Abstract: Websites of many rogue sellers of medications are accessible through links in email spam messages or via web search engines. This study examined how well students enrolled in a U.S. higher education institution could identify clearly unsafe pharmacies. The aim is to estimate these health consumers’ vulnerability to fraud by illegitimate Internet pharmacies. Two Internet pharmacy websites, created specifically for this study, displayed multiple untrustworthy features modeled after five actual Internet drug sellers which the authors considered to be potentially dangerous to consumers. The websites had none of the safe pharmacy signs and nearly all of the danger signs specified in the Food and Drug Administration’s (FDA’s) guide to consumers. Participants were told that a neighborhood pharmacy charged US$165 for a one-month supply of Beozine, a bogus drug to ensure no pre-existing knowledge. After checking its price at two Internet pharmacies—$37.99 in pharmacy A and $57.60 in pharmacy B—the respondents were asked to indicate if each seller was a good place to buy the drug. Responses came from 1,914 undergraduate students who completed an online eHealth literacy assessment in 2005-2008. Participation rate was 78%. In response to "On a scale from 0-10, how good is this pharmacy as a place for buying Beozine?" many respondents gave favorable ratings. Specifically, 50% of students who reviewed pharmacy A and 37% of students who reviewed pharmacy B chose a rating above the scale midpoint.
When explaining a low drug cost, these raters related it to low operation costs, ad revenue, pressure to lower costs due to comparison shopping, and/or high sales volume. Those who said that pharmacy A or B was "a very bad place" for purchasing the drug (25%), as defined by a score of 1 or less, related low drug cost to lack of regulation, low drug quality, and/or customer information sales. About 16% of students thought that people should be advised to buy cheaper drugs at pharmacies such as these but the majority (62%) suggested that people should be warned against buying drugs from such internet sellers. Over 22% of respondents would recommend pharmacy A to friends and family (10% pharmacy B). One-third of participants supplied online health information to others for decision-making purposes. After controlling for the effects of education, health major, and age, these respondents had significantly worse judgment of Internet pharmacies than those who did not act as information suppliers. At least a quarter of students, including those in health programs, cannot see multiple signs of danger displayed by rogue Internet pharmacies. Many more are likely to be misled by online sellers that use professional design, veil untrustworthy features, and mimic reputable websites. Online health information consumers would benefit from education initiatives that (1) communicate why it can be dangerous to buy medications online and that (2) develop their information evaluation skills. This study highlights the importance of regulating rogue Internet pharmacies and curbing the danger they pose to consumers.

Ivanitskaya, L. (06/06/2010-08/28/2010). Supplement to R03 ES17401-01A2: Assessing health information consumers’ competencies in managing digital media: An evaluation of an online interactive assessment ($67,920). NOT-OD-09-060: Administrative Supplements Providing Summer Research Experiences for Students and Science Educators.

Hatschbach, M. H. L. (December, 2009). Information literacy among higher education Students of Tourism in Brazil. Doctoral Thesis (Doctorate in Information Science). Universidade Federal Fluminense-UFF/Instituto Brasileiro de Informação em Ciência e Tecnologia-Ibict, Rio de Janeiro. Abstract: This research is about information literacy in higher education, taking Tourism as the study field. In so doing, it points out the several interfaces that exist between Tourism and Information Science. The development of Tourism is associated with the increase in use of new information and communication technologies and with the corresponding need for information-competent human resources. An analysis of the Brazilian Tourism sector and an overview of higher education in the field are presented in this study. The development of information literacy studies internationally, in Latin America and especially in Brazil is also considered here, with a thorough review of Brazilian studies from 2001, when the first articles appeared in Information Science journals. The empirical study involves an information literacy online test applied to the undergraduate students of Tourism of the Universidade Federal Fluminense (UFF), in Rio de Janeiro. The Research Readiness Self-Assessment (RRSA), developed at Central Michigan University, was translated and adapted to the Brazilian environment. Data analyses of the 48 questions included indicate, among other factors, that Brazilian students show a satisfactory comprehension of some dimensions of information literacy but also reveal that some specific aspects should be improved. Deficiencies in information literacy can compromise the prospects of Brazilian professionals in the information society and suggest that special attention needs to be given to these needs by higher education professionals and institutions. Portuguese: Hatschbach, Maria Helena de Lima.
A Competência em Informação de Estudantes de Graduação em Turismo: Um estudo de caso no Brasil. 2009. Tese (Doutorado em Ciência da Informação) – Universidade Federal Fluminense-UFF/Instituto Brasileiro de Informação em Ciência e Tecnologia-Ibict, Rio de Janeiro. Orientador: Gilda Olinto.

Zehner, D. C. B. (November, 2009). Factors affecting information literacy perception and performance. Doctoral Dissertation, University of South Carolina, & Poster presented at the Biennial American Association of School Librarians' Conference in Charlotte, NC. Abstract: Information literacy, defined as “the set of skills needed to find, retrieve, analyze, and use information” (American Library Association, 2003, ¶ 1), is necessary for success in life. The present study examines whether the factors of gender, race, and/or socioeconomic status impact information literacy performance and information literacy perception of skill as measured by the Research Readiness Self-Assessment (RRSA-lib). After being informed of their perceived skill and their actual performance, 170 high school participants were asked if this information changes the likelihood they would seek further information literacy-related instruction. Results were obtained using multivariate analysis of variance (MANOVA). Participants who were classified as White performed higher on one performance subscale, Obtaining Information. Whites also performed significantly lower on the Reliance on Browsing the Internet (vs. the library) subscale, indicating they rely less on web browsing to find information than non-Whites. There were no significant main effects with regards to gender or socioeconomic status on any subscales. The majority of participants (n = 107) indicated that they would seek further information literacy instruction after seeing their RRSA scores and feedback. Additionally, the study participants performed significantly better on most subscales than a group of first-year college students at a four-year university. The results obtained inform school, college, and public librarians about factors affecting perception and performance and can be used to design appropriate interventions for each group.
Analysis of participants’ willingness to seek further training can also be used to inform librarians as they seek to create an information literate population.

Swartz, B. L., Ratcliff, A., Ivanitskaya, L. (November, 2009). Pre-Professional Students' Attitudes and Beliefs Regarding Their Information Literacy Skills. Poster presented at the Annual American Speech-Language-Hearing Association (ASHA) Convention in New Orleans, LA.

Ivanitskaya, L., Brookins-Fisher, J., O’Boyle, I., Vibbert, D., Erofeev, D., & Cooper, A. (October, 2009). Dirt cheap and without prescription: How susceptible are U.S. consumers to purchasing drugs from illegitimate internet pharmacies? Paper and poster presented at the Health Literacy Annual Research Conference in Washington, DC. Abstract: Background. Websites of many rogue sellers of medications are accessible through links in email spam messages or via web search engines. This study examined how well students enrolled in a U.S. higher education institution could identify unsafe pharmacies by paying attention to signs specified in the Food and Drug Administration’s (FDA’s) guide to consumers. Objectives. The aim is to estimate these health consumers’ vulnerability to fraud by illegitimate Internet pharmacies. Methods. Two Internet pharmacy websites, created specifically for this study, displayed multiple untrustworthy features modeled after five actual Internet drug sellers which the authors considered to be potentially dangerous to consumers. The websites had none of the safe pharmacy signs and nearly all of the danger signs specified by the FDA. Participants were told that a neighborhood pharmacy charged US$165 for a one-month supply of Beozine (a bogus drug to ensure no pre-existing knowledge). After checking its price at two Internet pharmacies—$37.99 in pharmacy A and $57.60 in pharmacy B—the respondents were asked to indicate if each seller was a good place to buy the drug. Responses came from 1,914 students who completed an online eHealth literacy assessment in 2005-2008. Participation rate was 78%. Results. In response to “How good is pharmacy [A or B] as a place for buying Beozine?” many respondents gave favorable ratings. Specifically, 50% of students who reviewed pharmacy A and 37% of students who reviewed pharmacy B gave ratings between 5 and 10 on a 10-point scale. 
Less than 25% said that pharmacy A or B was “a very bad place” for purchasing the drug, as defined by a score of 1 or less. They related low drug cost to lack of regulation, low drug quality, and/or customer information sales. Favorable raters tended to say that it was because of low operation costs, ad revenue, pressure to lower costs due to comparison shopping, and/or high sales volume. About 16% of students thought that people should be advised to buy cheaper drugs at pharmacies such as these but the majority (62%) suggested that people should be warned against buying drugs from such internet sellers. Over 22% of respondents would recommend pharmacy A to friends and family (10%—pharmacy B). One-third of participants found online health information to help others make health decisions or made such decisions for themselves. After controlling for the effects of education, health major, and age, these respondents had significantly worse judgment of Internet pharmacies than those who did not use online information for health decision-making. Conclusions. At least a quarter of students, including those in health programs, cannot see multiple signs of danger displayed by rogue Internet pharmacies. More will be misled by sellers that use professional design, veil untrustworthy features, and mimic reputable websites. Online health information consumers would benefit from education initiatives that develop skills in finding and evaluating online health information. Also, much work is to be done in regulating rogue Internet pharmacies and curbing the danger they pose to consumers.

Ivanitskaya, L., Brookins-Fisher, J., Garrison, J., Heuberger, R., Leonard, W., & O’Boyle, I. (06/03/2009-05/31/2011). R03 ES17401-01A2: Assessing health information consumers’ competencies in managing digital media: An evaluation of an online interactive assessment ($141,500). Understanding and Promoting Health Literacy [PAR07-019] grant awarded by the National Institutes of Health (NIH). Priority score: 138.

Redmond, T., Ivanitskaya, L., Hollis, R., & Hickson, D. (May 2009). Health Information Competency: A Comparative Assessment of Rural and Non-Rural Students’ Knowledge and Skills Related to Managing of Electronic (Digital) Health Information. A poster presented at the 8th Annual Health Literacy Conference ("Health Literacy: Bridging Research and Practice") by the Institute for Healthcare Advancement (IHA) in Irvine, CA. The poster was based on dissertation research by Redmond (2007, see its abstract below). Nominated for an IHA Health Literacy Award in the Research category.

Boucher, C., Dalziel, K., Davies, M., Glen, S., & Chandler, J. (March 2009). Are Postgraduates Ready for Research? Poster presented at the Librarians Information Literacy Annual Conference (LILAC) at Cardiff University, Wales, U.K. Summary of conclusions: Students’ self-perception of their Information Literacy skills is not always a reliable basis upon which to determine training needs in this area. Student feedback indicates that RRSA is a useful tool in realistically assessing skills and training needs. Students who took RRSA were nearly twice as likely to attend library training as those who hadn’t. Using RRSA can help to raise awareness of, and attendance at, Information Literacy training sessions for research postgraduates. Poster: http://rrsa.cmich.edu/documents/RRSA_poster_LILAC_2009.pdf

Swartz, B. L., Ratcliff, A., & Ivanitskaya, L. (November 2008). Evidence-Based Practice: Status of Pre-professional Student Information Literacy Skills. Poster presented at the Annual American Speech-Language-Hearing Association (ASHA) Convention in New Orleans, LA. Summary of conclusions: As they progress through their education programs, Communication Disorders students demonstrate significantly better skills in selected areas of information literacy. Despite the positive association between information literacy and years of education, there is still room for improving the information literacy skills of students at all levels. To incorporate information literacy into the curricula of pre-professional students, information literacy skills should be formally assessed throughout the curriculum. Standards of information literacy should be integrated into activities, assignments, courses, and the curriculum as a whole by considering students’ learning patterns and preferences.

Ivanitskaya, L., & Beehr, T. (May 2008). Electronic Visual Analog Scales: Instant Feedback Prompts Raters to Select Discrete Numbers. Poster and paper presented at the 20th Annual Convention of the Association for Psychological Science (Brain, Body, Behavior, and Health track), in Chicago, IL. Abstract: Paper-based Visual Analog Scale (VAS) measures are common in the assessment of pain, global health, and quality of life; they are used in both inpatient and outpatient settings. On a VAS, the respondent simply makes a mark anywhere along a line that indicates a continuum from low to high. A VAS can be seen as a special case of a Likert scale in which there are potentially an unlimited number of response categories. Recently, electronic VASs have been implemented in data collection via palmtop computers, but their design was inconsistent. Moreover, the impact of their new features was not systematically evaluated. We tested the impact of two design elements of an internet-based VAS on respondents’ ratings: the a priori presence of a scale midpoint and immediate visual feedback to raters about their exact score as they are choosing it. The experimental design was 2 (midpoint or no midpoint indicated on the scale) x 2 (feedback on a three-decimal rating score chosen by the person versus no specific feedback). We hypothesized that use of immediate visual feedback with three decimals would lead participants to choose whole numbers and to avoid complex fractions, thereby providing scores more like a typical Likert scale (i.e., providing only a few discrete numbers and reducing an advantage of continuous scaling). In addition, we hypothesized that use of a midpoint would lead participants to choose numbers in the middle of the scale, which would result in reduced variance and, consequently, in reduced reliability.
Quantitative data were collected online from 326 individuals who were randomly assigned to experimental conditions and who provided VAS ratings on self-reported health and nutrition, self-appraisal of health information literacy, and beliefs about the use of internet search engines to find health-related information. The data analyses focused on an evaluation of distributions and estimates of internal consistency reliability. In midpoint conditions, participants chose numbers close to the midpoint more often than in conditions where the midpoint was not marked; however, this difference was not statistically significant. When feedback was provided, participants chose discrete numbers and the least complex fractions (half-points, such as 1.5 or 6.5) significantly more often than when feedback was not provided. Presence of a midpoint did not significantly affect avoidance of complex fractions. Although both feedback and a midpoint were associated with reduced variance, the midpoint contributed the most to the reduction of scale reliability (Cronbach’s alpha). The smallest variance was observed for variables measured under the midpoint-only condition. In the literature on math education, fraction avoidance syndrome is defined as an inability to deal with non-integer relationships. We applied information processing theories and concepts to explain our findings: fast retrieval of typical or familiar numbers, rapid activation of associative networks, the impact of schemas, cognitive burden, and difficulty in distinguishing among choices when the number of choices is large. In sum, electronic VASs with feedback produce less continuous data than VASs without feedback. Presence of a midpoint leads to lower scale reliability due to decreased variance.
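The reliability statistic at the center of this abstract, Cronbach's alpha, can be computed directly from an item-score matrix, which also makes the reported variance-reliability link concrete: shrinking item variance (as with midpoint clustering) shrinks alpha. A minimal sketch with NumPy; the ratings below are made-up illustrative values, not the study's data:

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_respondents x n_items) score matrix.

    alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)
    """
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)      # per-item sample variance
    total_var = scores.sum(axis=1).var(ddof=1)  # variance of summed scores
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical VAS ratings from 5 respondents on 3 items (0-10 scale)
ratings = [[2.0, 2.5, 3.0],
           [5.0, 5.5, 5.0],
           [7.5, 7.0, 8.0],
           [4.0, 4.5, 4.0],
           [9.0, 8.5, 9.0]]
print(round(cronbach_alpha(ratings), 3))
```

Because the formula divides summed item variance by total-score variance, any design feature that compresses responses toward a common value (such as a marked midpoint) reduces the variance terms and, with them, the alpha estimate.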

Pival, P. R., Lock, J. V., & Hunter, M. (2008). Assessing Research Readiness of Graduate Students in Distance Programs. In G. E. Siegel (Ed.), Libraries and Graduate Students. Haworth Press, Hard Cover/ISBN-13: 978-0-7890-3054-2. Published simultaneously as Public Services Quarterly, 3(3/4), 2007. Abstract: The aim of this descriptive research study was to assess the skill level, confidence, and overall research readiness of selected groups of graduate students (on and off campus) within two divisions housed in the Faculty of Education at the University of Calgary. The researchers expected participants to overestimate the value of the internet as a source of academically reputable information and to have a limited understanding of the complex nature of online academic research. These expectations were not borne out by the results of the assessment.

Mathson, M. S., & Lorenzen, M. G. (2008). We won't be fooled again: Teaching critical thinking via evaluation of hoax and historical revisionist websites in a library credit course. Paper accepted for publication in College and Undergraduate Libraries, 15(1/2). Abstract: At Central Michigan University, librarians teach multiple sections of an eight-week, one-credit research skills class to hundreds of undergraduate students each semester. While the main focus of the course is to teach students how to find, use, and properly cite library resources, librarians also address critical thinking skills by designing lessons that teach World Wide Web organization and how to analyze the information found via search engines. Showing students obvious hoax sites about “tree octopi” and “male pregnancy” introduces the concepts of critical thinking and website analysis. Most students quickly refute the information on such sites. However, students have a more difficult time assessing the validity of social, historical, or political revisionist websites. Contrasting those sites’ claims with evidence accepted by international courts, historians, and scientists is useful in pointing out the flaws of seemingly well-documented but one-sided revisionist sites. There are dangers in exposing students to these groups via their websites. Yet it is important to do so in order to convey the importance of critical analysis of information. The authors discuss students’ pre- and post-test (CMU’s online assessment tool, the Research Readiness Self-Assessment [RRSA]) scores to determine whether critical thinking skills have improved.

Ivanitskaya, L., DuFord, S., Craig, M., & Casey, A. M. (2008). How Does a Pre-Assessment of Off-Campus Students’ Information Literacy Affect the Effectiveness of Library Instruction? Journal of Library Administration, 48(3/4), 509-525. Abstract: This study investigates the impact pre-tests have on the effectiveness of library instruction when students are given feedback on their pre-test performance. Librarians and academic faculty partnered to measure library instruction outcomes in two Master’s degree classes. The Research Readiness Self-Assessment (RRSA) was used as a pre-test (before instruction) and a post-test (after instruction) in Class 1, and as a post-test only in Class 2. Students who completed both tests performed significantly better on the post-test, earning higher scores on obtaining information and overall information literacy. They reported greater library/research experience and less reliance on browsing. Compared to students who did not take a pre-test, students who received pre-test-based feedback had higher scores on library/research experience and lower scores on reliance on browsing. To enhance the effectiveness of library instruction, students can be given pre-test-based feedback that compares their actual and perceived literacy and encourages the use of library databases. Full text: http://condor.cmich.edu/cdm4/item_viewer.php?CISOROOT=/p1610-01coll1&CISOPTR=3131&CISOBOX=1&REC=4 and PowerPoint Presentation: http://condor.cmich.edu/cdm4/document.php?CISOROOT=/p1610-01coll1&CISOPTR=934&REC=1

Robling, D. K. (2007). Unregulated internet commerce: Student evaluation of online pharmacies. Health Education Monograph Series, 24(2), 15-21. Abstract: The growing use of the internet for commerce has created many dangerous situations for consumers as they purchase unregulated products and release their personal information. The purpose of this study is to examine the health literacy of students and their ability to evaluate websites, discerning trustworthy sites from untrustworthy ones. Using the Research Readiness Self-Assessment, health version (RRSA-h), students were shown pharmacy websites and asked to search for a drug called Beozine. These websites were modeled on claims, advertisements, and other features of existing online pharmacies. To create the model sites, the internet was searched for examples of bad pharmacies, using guidelines and characteristics from an American Cancer Institute fact sheet (2005) on how to evaluate websites. After observing many of these sites, a list was created of features that serve as indicators of low trustworthiness. Evaluation of the scores of 741 Central Michigan University students led to the conclusion that students are not accurate judges of dangerous websites: a quarter of students would recommend the dangerous sample sites to friends and family. However, students who declared health sciences-related majors performed better than students in other disciplines. A role for the health educator is to assist internet users in applying the skills necessary to accurately determine the trustworthiness of online information, as the internet is increasingly used to access health information.

Redmond, T. L. (2007). Electronic (digital) health information competency: A comparative analysis of knowledge and skills of rural and non-rural freshman college students. Doctoral dissertation, Central Michigan University. Abstract: According to the Institute of Medicine, approximately 70 million Americans use the internet to search for health information. This exploratory study investigated whether rural and non-rural residents of Georgia who have recently begun their college education and have earned fewer than 25 credit hours had significantly different electronic (digital) health information competencies, defined as skills and knowledge related to searching for and evaluating electronic health resources. An online interactive Research Readiness Self-Assessment (RRSA) tool was used to measure health information competency scores of rural (n = 90) and non-rural (n = 153) freshmen. Independent-sample t tests revealed a statistically significant difference between the two groups, t(241) = 2.23, p = .03, d = .29, in the ability to obtain electronic health information. Specifically, compared to the rural group (M = 16.18, SD = 4.37), the non-rural group had a higher mean score for obtaining electronic health information (M = 17.41, SD = 4.01). The two groups did not differ significantly in their ability to evaluate health information, t(241) = .14, p = .89, d = .02, or in overall health information competency, t(241) = .34, p = .18, d = .18. The mean overall competency score was 29 points for rural students and 30 points for non-rural students (out of a maximum possible 51 points), which indicates that more emphasis on research skills related to searching for and evaluating health information using electronic resources is needed in order to build competency levels. Further regression analysis confirmed that, compared to non-rural residents, rural residents earned lower scores in the ability to obtain electronic health information.
Recommendations for increasing students’ health information competencies include establishing partnerships among the librarians, faculty, and writing centers available on campus. In addition, involving students in faculty publications, along with course objectives that require students to conduct research using empirical literature in various disciplines, is vital to the development of these skills.
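The group comparison reported in this abstract is a standard pooled-variance independent-samples t test with Cohen's d as the effect size. A minimal sketch of that computation with NumPy; the two score lists are illustrative placeholders, not the dissertation's rural/non-rural samples:

```python
import numpy as np

def independent_t(a, b):
    """Pooled-variance independent-samples t test.

    Returns (t statistic, degrees of freedom, Cohen's d).
    """
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    na, nb = len(a), len(b)
    # Pooled variance: the two sample variances weighted by their df
    sp2 = ((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1)) / (na + nb - 2)
    t = (a.mean() - b.mean()) / np.sqrt(sp2 * (1 / na + 1 / nb))
    d = (a.mean() - b.mean()) / np.sqrt(sp2)  # standardized mean difference
    return t, na + nb - 2, d

# Hypothetical "obtaining information" scores for two small groups
non_rural = [18, 16, 19, 17, 20]
rural = [15, 16, 14, 17, 15]
t_stat, df, effect = independent_t(non_rural, rural)
print(t_stat, df, effect)
```

With the study's group sizes (90 + 153), the degrees of freedom come out to 241, matching the t(241) values reported above.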

Ivanitskaya, L., O’Boyle, I., & Casey, A. M. (2006). Health Information Literacy and Competencies of Information Age Students: Results from the Interactive Online Research Readiness Self-Assessment (RRSA). Journal of Medical Internet Research, 8(2):e6 Abstract: Background. In an era of easy access to information, university students who will soon enter health professions need to develop their information competencies. The Research Readiness Self-Assessment (RRSA) is based on the Information Literacy Competency Standards for Higher Education, and it measures proficiency in obtaining health information, evaluating the quality of health information, and understanding plagiarism. Objective. This study aimed to measure the proficiency of college-age health information consumers in finding and evaluating electronic health information; to assess their ability to discriminate between peer-reviewed scholarly resources and opinion pieces or sales pitches; and to examine the extent to which they are aware of their level of health information competency. Methods. An interactive 56-item online assessment, the Research Readiness Self-Assessment (RRSA), was used to measure the health information competencies of university students. We invited 400 students to take part in the study, and 308 participated, giving a response rate of 77%. The RRSA included multiple-choice questions and problem-based exercises. Declarative and procedural knowledge were assessed in three domains: finding health information, evaluating health information, and understanding plagiarism. Actual performance was contrasted with self-reported skill level. Upon answering all questions, students received a results page that summarized their numerical results and displayed individually tailored feedback composed by an experienced librarian. Results. 
Even though most students (89%) understood that a one-keyword search is likely to return too many documents, few students were able to narrow a search by using multiple search categories simultaneously or by employing Boolean operators. In addition, nearly half of the respondents had trouble discriminating between primary and secondary sources of information as well as between references to journal articles and other published documents. When presented with questionable websites on nonexistent nutritional supplements, only 50% of respondents were able to correctly identify the website with the most trustworthy features. Less than a quarter of study participants reached the correct conclusion that none of the websites made a good case for taking the nutritional supplements. Up to 45% of students were unsure if they needed to provide references for ideas expressed in paraphrased sentences or sentences whose structure they modified. Most respondents (84%) believed that their research skills were good, very good, or excellent. Students’ self-perceptions of skill tended to increase with increasing level of education. Self-reported skills were weakly correlated with actual skill level, operationalized as the overall RRSA score (Cronbach alpha = .78 for 56 RRSA items). Conclusions. While the majority of students think that their research skills are good or excellent, many of them are unable to conduct advanced information searches, judge the trustworthiness of health-related websites and articles, and differentiate between various information sources. Students’ self-reports may not be an accurate predictor of their actual health information competencies. Full text: http://www.jmir.org/2006/2/e6/

Burich, N. J., Casey, A. M., Devlin, F. A., & Ivanitskaya, L. (2006). Project Management and Institutional Collaboration in Libraries. Technical Services Quarterly, 24(1), 17-36. Abstract: As most libraries in the United States struggle with declining financial support, combined with the ever-increasing need to incorporate new technology into services and the profusion of resources that are available, inter-institutional cooperation is becoming more common. Planning and implementing new projects in libraries is better carried out if the project is formally managed from the beginning to ensure an orderly and efficient completion to the project. Two institutions have developed new services that they sought to expand to other institutions. Though neither project set out to use project-management techniques, the development of the new services illustrates their practical use.

Ivanitskaya, L., Laus, R., & Casey, A. M. (2004). Research Readiness Self-Assessment: Assessing Students' Research Skills and Attitudes. Journal of Library Administration, 41(1/2), 167-183. Abstract: Librarians and learning researchers at Central Michigan University collaboratively developed an online tool that assesses how student research attitudes and perceptions correlate to their actual research skills in order to educate them about state-of-the-art library resources and prepare them to write high-quality research papers. This article describes the reasons for developing the assessment as well as the design process and technical characteristics. Full text: http://www.haworthpress.com/store/E-Text/View_EText.asp?a=3&fn=J111v41n01_13&i=1%2F2&s=J111&v=41 or http://repository.cmich.edu/u?/p1610-01coll1,1806

Our work was cited in... (stopped tracking in 2010)

  1. Carney PA, Bunce A, Perrin N, Howarth LC, Griest S, Beemsterboer P, Cameron WE. Educating the public about research funded by the National Institutes of Health using a partnership between an academic medical center and community-based science museum. J Community Health. 2009 Aug;34(4):246-54. PMID: 19350373 [PubMed - indexed for MEDLINE] Available online: http://www.springerlink.com/content/13872x7121566751/fulltext.pdf
  2. Eysenbach, Gunther. “Credibility of Health Information and Digital Media: New Perspectives and Implications for Youth." Digital Media, Youth, and Credibility. Edited by Miriam J. Metzger and Andrew J. Flanagin. The John D. and Catherine T. MacArthur Foundation Series on Digital Media and Learning. Cambridge, MA: The MIT Press, 2008. 123–154. doi: 10.1162/dmal.9780262562324.123 Available online: http://www.mitpressjournals.org/doi/pdf/10.1162/dmal.9780262562324.123
  3. Trettin LD, May JC, McKeehan NC. Teaching teens to "Get Net Smart for Good Health": comparing interventions for an Internet training program. J Med Libr Assoc. 2008 Oct;96(4):370-4. PMID: 18974815 [PubMed - indexed for MEDLINE] Available online: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2568845/pdf/mlab-96-04-370.pdf
  4. Breckons M, Jones R, Morris J, Richardson J. What do evaluation instruments tell us about the quality of complementary medicine information on the internet? J Med Internet Res. 2008 Jan 22;10(1):e3. PMID: 18244894 [PubMed - indexed for MEDLINE] Available online: http://www.jmir.org/2008/1/e3/
  5. Willinsky J, Quint-Rapoport M. How complementary and alternative medicine practitioners use PubMed. J Med Internet Res. 2007 Jun 29;9(2):e19. PMID: 17613489 [PubMed - indexed for MEDLINE] Available online: http://www.jmir.org/2007/2/e19/
  6. Cobus L. Integrating information literacy into the education of public health professionals: roles for librarians and the library. Journal of the Medical Library Association (J MED LIBR ASSOC), 2008 Jan; 96(1): 28-33.
  7. Sabo, R. Consumer Health Websites as a Platform for Teaching Evaluation of Internet Content in a Library Instruction Course. MLA Forum, Vol. V, Issue 3, article 6, June 2007. Available online: http://www.mlaforum.org/volumeV/issue3/article6.html
  8. David, J. Teaching Web 2.0: Media Literacy. Educational Leadership. March 2009;66(6):84-86. Available online: http://www.ascd.org/publications/educational_leadership/mar09/vol66/num06/Teaching_Media_Literacy.aspx
  9. Liu, R., Lub, Y. Context-based online medical terminology navigation. Expert Systems with Applications. March 2010;37(2), 1594-1599. doi:10.1016/j.eswa.2009.06.038
  10. Harris, K., McLean, J., Sheffield, J. Solving suicidal problems online: Who turns to the Internet for help? Advances in Mental Health. 2009;8(1):28-36. Available online: http://amh.e-contentmanagement.com/archives/vol/8/issue/1/article/3250/solving-suicidal-problems-online
  11. Liu, R. Text Classification for Healthcare Information Support. Lecture Notes in Computer Science, vol. 4570, 2007. DOI: 10.1007/978-3-540-73325-6
  12. Alan E. Levine, Ph.D., M.Ed.; Richard D. Bebermeyer, D.D.S.; Jung-Wei Chen, D.D.S., Ph.D.; Dell Davis, M.S.I.S.; Carolyn Harty, R.N., M.L.S. Development of an Interdisciplinary Course in Information Resources and Evidence-Based Dentistry. J Dent Educ. 72(9): 1067-1076 2008.
  13. Colepícolo, Eliane. Epistemologia da Informática em Saúde: entre a teoria e a práti-ca. São Paulo, 2008. Universidade Federal de São Paulo. Programa de Pós-graduação em Informática em Saúde. Título em inglês: Epistemology of Medical Informatics: between theory and practice. Available online: http://www.disacad.unifesp.br/sapg/arquivos/arq_55.pdf
  14. M. H. Qurban and R. D. Austria. Public perception on e-health services: implications of preliminary findings of KFMMC for military hospitals in KSA. Paper presented at the European and Mediterranean Conference on Information Systems 2008 (EMCIS2008), May 25-26, 2008, Al Bustan Rotana Hotel, Dubai.
  15. Caroline F. Timmers, Cees A.W. Glas. Developing scales for information-seeking behaviour. Journal of Documentation, 2010:66(1), 46-69. DOI 10.1108/00220411011016362
  16. Linda Duffett-Leger, MN, RN, Dr. Barbara Paterson, Dr. Wayne Albert. Optimizing Health Outcomes by Integrating Health Behavior and Communication Theories in the Development of e-Health Promotion Interventions. eHealth International. 2008:4(2);23. Available online at http://ehealthinternational.org/vol4num2/Vol4Num2p23.pdf
  17. Macklin, Alexius Smith, Culp, F Bartow. Information Literacy Instruction: Competencies, Caveats, and a Call to Action. Science & Technology Libraries. Vol. 28, no. 1-2, pp. 45-61. 2008
  18. Suzy A. Iverson, DO, Kristin B. Howard, BA, Brian K. Penney, PhD Impact of Internet Use on Health-Related Behaviors and the Patient-Physician Relationship: A Survey-Based Study and Review. J Am Osteopath Assoc. 2008;108:699-711
  19. Kris S. Freeman and Jan H. Spyridakis. Effect of Contact Information on the Credibility of Online Health Information. IEEE transactions on professional communication, vol. 52, no. 2, June 2009.
  20. J. Mantas; E. Ammenwerth; G. Demiris; A. Hasman; R. Haux; W. Hersh; E. Hovenga; K. C. Lun; H. Marin; F. Martin-Sanchez; G. Wright. Recommendations of the International Medical Informatics Association (IMIA) on Education in Biomedical and Health Informatics (IMIA Recommendations on Education Task Force). Methods Inf Med 2010; 49: x-xx. doi: 10.3414/ME5119
  21. Chisolm, D. J. Does Online Health Information Seeking Act Like a Health Behavior: A Test of the Behavioral Model. Telemed J E Health. 2010 Feb 16. PMID: 20156127

-- LanaVIvanitskaya - 2010-05-10

 *Ivanitskaya, L. (06/06/2010-08/28/2010). Supplement to R03 ES17401-01A2: Assessing health information consumers’ competencies in managing digital media: An evaluation of an online interactive assessment ($67,920). NOT-OD-09-060: Administrative Supplements Providing Summer Research Experiences for Students and Science Educators.* 