Crafting a literature review on “How do young people evaluate information pertaining to sports and fitness” with the help of jenni.ai

It’s finally time to do a literature review on how young people evaluate information about sports and fitness. It is not meant for publication yet; instead, I want to show it to my colleagues in the department so we can discuss the use of generative AI systems in research.

  • I have already done a Google search (hitting many library guides) and a Google Scholar search, from which I found around 20 publications (some recent, others highly cited).
  • I have already imported some of those references into my reference manager – in my case, plain .bib files that I host in a Git repository. This works better for me than Zotero or Mendeley, mostly because I use Linux and LaTeX, write on both my home and office computers, and change or upgrade my computers often.
  • I printed a few of those, sat down, and read them.
  • From the reference lists of the first printed papers, I highlighted many must-read references – the so-called snowballing approach.
  • I have already paid for a month’s subscription to Jenni AI, as I found the system quite powerful and want to use its full potential. Quite an expensive subscription!!
  • I do know there is almost no research on this specific topic. I know, however, that there is a lot of research on evaluating online resources.
  • I have already found some quotes that can be used to argue that this research is important.
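For the curious, keeping references as plain .bib files in Git means each reference is just a BibTeX entry in a versioned text file. A minimal sketch using one of the papers I found (the file name and entry key are my own placeholders):

```bibtex
% One entry in refs.bib, tracked in the Git repository
@article{dragulanescu2002website,
  author  = {Dragulanescu, Nicolae-George},
  title   = {Website Quality Evaluations: Criteria and Tools},
  journal = {International Information \& Library Review},
  year    = {2002},
  volume  = {34},
  number  = {3},
  pages   = {247--254},
  doi     = {10.1006/iilr.2002.0205}
}
```

Since it is plain text, diffs and merges across my home and office computers work just like for any other code.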

As stated in some library guides available online, such as this one from the University of Reading …

One common way to approach a literature review is to start out broad and then become more specific. Think of it as an inverted triangle:

So my literature review will be broken down into some mini-literature reviews with different research questions (that will be the prompts for Jenni AI) that go from general to specific.

For the Introduction section:

    1. What is information evaluation?
    2. What other concepts are used to refer to the evaluation of information?
    3. Why is information evaluation important in general?
    4. Why is information evaluation important in the particular context of sports and fitness?

For the theoretical background section:

    1. What disciplines or subjects research the evaluation of information?
    2. What models and theories in information behavior address the evaluation of information?
    3. What models and theories in information literacy address the evaluation of information?
    4. Was most of the research on information evaluation conducted in the educational context?
    5. How to evaluate information from online resources?
    6. How do young people evaluate information in everyday life?
    7. How do young people evaluate information about sports and fitness?

Note that I am going from general to specific, all towards showing that I am aware of existing research and motivating my empirical research.

So it’s now time to go to Jenni.ai and finally prompt “What is information evaluation?” …

The system asked me “What are you writing today?”

I then answered “Writing a literature review on what is information evaluation? I am looking for some widely used definitions.” – that was my starting prompt.

🙁 Jenni did not understand it … so I asked in the chat instead.

Can you give me some references?

🙂 Jenni then gave me the following references. Still, I need to write the introduction of the concept all by myself. Anyway, I have three references to look into.

  1. Alison J. Head and Michael B. Eisenberg's study on students' information-seeking strategies and research difficulties, as part of Project Information Literacy, which provides insights into how college students assess the credibility of information sources for academic purposes.
  2. Johann Windt, Kerry MacDonald, David R. Taylor, et al.'s work on a decision-making framework for implementing technology in sport, which discusses evaluating technology and its metrics within a specific context and sheds light on the importance of evaluating evidence from all available sources.
  3. Anne Davey and Louise Kim Tucker's examination of how higher education library services can support students' employability by aiding in the evaluation and sifting of quality, authoritative information, which implies the critical nature of information evaluation in career development.

The struggle continues … After 90 minutes searching for “What is information evaluation?” with Jenni.ai and Google Scholar, I am exhausted!! The only thing I got was a set of seven references to start with.

1) Dragulanescu, N.-G. (2002). Website quality evaluations: Criteria and tools. International Information & Library Review, 34(3), 247–254. https://doi.org/10.1006/iilr.2002.0205

2) Head, A. J., & Eisenberg, M. B. (2010). Truth be told: How college students evaluate and use information in the digital age. Social Science Research Network. https://doi.org/10.2139/ssrn.2281485

3) Eisenberg, M. B. (2008). Information literacy: Essential skills for the information age. DESIDOC Journal of Library & Information Technology.

4) Helms-Park, R., & Stapleton, P. (2006). How the views of faculty can inform undergraduate web-based research: Implications for academic writing. https://doi.org/10.1016/j.compcom.2006.08.003

5) Windt, J., MacDonald, K., Taylor, D. R., Zumbo, B. D., Sporer, B. C., & Martin, D. T. (2020). “To tech or not to tech?” A critical decision-making framework for implementing technology in sport. Journal of Athletic Training, 55(9), 902–910. https://doi.org/10.4085/1062-6050-0540.19

6) Davey, A., & Kim Tucker, L. (2010). Enhancing higher education students' employability and career management: A library service approach. Library Review, 59(6), 445–454. https://doi.org/10.1108/00242531011053959

7) YouTube Assessment as a Means of Documentary Research Students. (2019, October 28). https://dl.acm.org/doi/10.1145/3369255.3369299

Anyway, it is now time to write the intro based on these references.

After writing a few sentences, I went back to Jenni.AI and tried to create headings in the document for the different prompts. I learned that from the moment I talked about information in the context of sports and fitness, Jenni.AI started ignoring information evaluation at a more general level. Jenni.AI might be more adequate for supporting systematic literature reviews that do not follow the inverted triangle.

So I coped with it by creating separate documents on “Information evaluation and information behavior” or “Disciplines researching information evaluation”. Then I realized I could not copy them back into my main document without losing the references. In the end, I learned that working with the chat worked better.

Lesson learned 1) If you are adopting the inverted triangle strategy for a literature review, do not reveal the specific context you are working on too early. Otherwise, Jenni.AI will limit the scope of your literature review.

I changed strategy and started getting positive results.

I first asked the question mapping to a specific subsection of my literature review; Jenni answered by generating text; then I asked for the references backing it. After that, I would write things by myself.

After several iterations, I had an actual AI-powered literature review.

After many hours – it was lunchtime by now – I exported what Jenni AI wrote, with my edits, to LaTeX. The reference list looked like this!! The document was already several pages long.

Lesson learned 2) Jenni.AI saves a lot of time by exporting the literature review to a LaTeX/biblatex format. I used to spend so much time importing references from Google Scholar and publishers' websites and loading them into my papers-and-references repository. It also exports to Word and HTML, but I don't think it links to Zotero, EndNote, or Mendeley reference management software. So I have another reason to use LaTeX whenever possible.
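For readers less familiar with biblatex, a minimal sketch of how the exported files plug into a LaTeX document. I am assuming the export produces a references.bib file; the file name and the citation key are my own placeholders, not Jenni.AI's actual output:

```latex
% Minimal biblatex setup consuming the exported bibliography
\documentclass{article}
\usepackage[backend=biber, style=apa]{biblatex}
\addbibresource{references.bib} % the .bib file exported from Jenni.AI

\begin{document}
How students evaluate information has been studied before \autocite{head2010truth}.
\printbibliography
\end{document}
```

Compile with pdflatex, then biber, then pdflatex again; once this skeleton works, the exported entries drop straight into my existing Git-hosted .bib workflow.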

By the end of the morning's work, this was the reference list outlined by Jenni.AI.

Jenni.AI proposed references:
  • Credibility of Health Information and Digital Media: New Perspectives and Implications for Youth | Semantic Scholar. (2007, August 12). https://www.semanticscholar.org/paper/Credibility-of-Health-Information-and-Digital-New-Eysenbach/30bb3df178414a69d8948cca1f1abf14c602c875
  • Barbu, M C R., Turcu, I., Sandu, I E., Diaconescu, D., Păsărin, L D., & Popescu, M. (2020, December 31). The impact of technology on the definition of sport. https://doi.org/10.29081/gsjesh.2020.21.2s.01
  • Bermúdez, A L., Cortesi, S., Fieseler, C., Gasser, U., Hasse, A., Newlands, G., & Wu, S J. (2020, January 1). Youth and the Digital Economy: Exploring Youth Practices, Motivations, Skills, Pathways, and Value Creation. https://doi.org/10.2139/ssrn.3622572
  • Bermúdez, A L., Cortesi, S., Fieseler, C., Gasser, U., Hasse, A., Newlands, G., & Wu, S J. (2020, January 1). Youth and the Digital Economy: Exploring Youth Practices, Motivations, Skills, Pathways, and Value Creation. Social Science Research Network. https://doi.org/10.2139/ssrn.3622572
  • Correia, A M R., & Teixeira, J C. (2003, October 1). Information literacy: an integrated concept for a safer Internet. https://doi.org/10.1108/14684520310502261
  • Dewi, N I K., Gede, I G K., Kencanawati, A A A M., & Mataram, I G A B. (2021, January 1). Social Media Usage by Generation Z in Pre-trip Planning. Advances in social science, education and humanities research. https://doi.org/10.2991/assehr.k.210424.036
  • Dragulanescu, N. (2002, September 1). Website Quality Evaluations: Criteria and Tools. International Information & Library Review, 34(3), 247-254. https://doi.org/10.1006/iilr.2002.0205
  • Fairuz, T., Kaniawati, I., & Sinaga, P. (2019, November 1). Enhancing critical thinking skills and information literacy of students through integrated science teaching materials. Journal of physics, 1280(3), 032045-032045. https://doi.org/10.1088/1742-6596/1280/3/032045
  • Graham, R. (2013, December 1). The perception of digital objects and their impact on development. https://doi.org/10.1080/02668734.2013.853908
  • Graham, R. (2013, December 1). The perception of digital objects and their impact on development. Psychoanalytic Psychotherapy, 27(4), 269-279. https://doi.org/10.1080/02668734.2013.853908
  • Haddouche, H., & Salomone, C. (2018, February 21). Generation Z and the tourist experience: tourist stories and use of social networks. , 4(1), 69-79. https://doi.org/10.1108/jtf-12-2017-0059
  • Halliburton, A L., & Weiss, M R. (2002, December 1). Sources of Competence Information and Perceived Motivational Climate among Adolescent Female Gymnasts Varying in Skill Level. Journal of Sport & Exercise Psychology, 24(4), 396-419. https://doi.org/10.1123/jsep.24.4.396
  • Head, A J., & Eisenberg, M B. (2010, January 1). Truth Be Told: How College Students Evaluate and Use Information in the Digital Age. https://doi.org/10.2139/ssrn.2281485
  • Head, A J., & Eisenberg, M B. (2010, January 1). Truth Be Told: How College Students Evaluate and Use Information in the Digital Age. Social Science Research Network. https://doi.org/10.2139/ssrn.2281485
  • Kang, S., Ha, J., & Hambrick, M E. (2015, May 1). A Mixed-Method Approach to Exploring the Motives of Sport-Related Mobile Applications Among College Students. https://doi.org/10.1123/jsm.2013-0065
  • Kim, K., Yoo‐Lee, E., & Sin, S J. (2011, January 1). Social media as information source: Undergraduates' use and evaluation behavior. , 48(1), 1-3. https://doi.org/10.1002/meet.2011.14504801283
  • Li, R., Hassan, N C., & Saharuddin, N. (2023, July 31). College Student’s Academic Help-Seeking Behavior: A Systematic Literature Review. Behavioral sciences, 13(8), 637-637. https://doi.org/10.3390/bs13080637
  • Li, Y. (2012, September 1). Undergraduate students searching and reading Web sources for writing. Educational Media International, 49(3), 201-215. https://doi.org/10.1080/09523987.2012.738013
  • McMillan, S J., & Morrison, M. (2006, February 1). Coming of age with the internet. New Media & Society, 8(1), 73-95. https://doi.org/10.1177/1461444806059871
  • Pak, B C A W P. (2006, December 1). [PDF] Multidisciplinarity, interdisciplinarity and transdisciplinarity in health research, services, education and policy: 1. Definitions, objectives, and evidence of effectiveness. | Semantic Scholar. https://www.semanticscholar.org/paper/Multidisciplinarity%2C-interdisciplinarity-and-in-and-Choi-Pak/3ad0fb9f828360a4989df637d518867009f61796
  • Rieh, S Y. (2001, December 14). Judgment of information quality and cognitive authority in the Web. https://doi.org/10.1002/asi.10017
  • Savolainen, R., & Kari, J. (2004, September 1). Placing the Internet in information source horizons. A study of information seeking by Internet users in the context of self-development. https://doi.org/10.1016/j.lisr.2004.04.004
  • Swanson, R W., & Mayer, J. (1975, May 1). Performing evaluation studies in information science. Journal of the American Society for Information Science, 26(3), 140-156. https://doi.org/10.1002/asi.4630260303
  • Truth Be Told: How College Students Evaluate and Use Information in the Digital Age. (2010, January 1). https://doi.org/10.2139/ssrn.2281485
  • Tury, S., Robinson, L., & Bawden, D. (2015, May 1). The Information Seeking Behaviour of Distance Learners: A Case Study of the University of London International Programmes. The Journal of Academic Librarianship, 41(3), 312-321. https://doi.org/10.1016/j.acalib.2015.03.008
  • USA, W S H B C C C N P. (2016, July 15). The information-seeking behavior of visual artists: a literature review. https://www.emerald.com/insight/content/doi/10.1108/00220410810867579/full/html
  • W, M P C. (2001, July 1). Extending the technology acceptance model. https://dl.acm.org/doi/10.1145/506724.506730
  • Wiley, J., Goldman, S R., Graesser, A C., Sanchez, C A., Ash, I K., & Hemmerich, J. (2009, December 1). Source Evaluation, Comprehension, and Learning in Internet Science Inquiry Tasks. American Educational Research Journal, 46(4), 1060-1106. https://doi.org/10.3102/0002831209333183

All very promising, but by 12:31 Jenni AI was not exporting the document to LaTeX. I needed to restart the browser, log in, and try again. I tried their support, but no luck.

After lunch I realized that Jenni.AI was not able to export the literature review to LaTeX/biblatex if entries with unknown authors (i.e., N.D.) existed. After identifying the authors of all my reference list entries, I could finally run LaTeX.

Lesson learned 3) Jenni.AI is not very stable yet; it's experimental. However, it is a very promising tool for helping craft literature reviews.

I became curious about how plagiarism and AI detection tools would report on the literature review exported from Jenni.AI. I went to https://www.scribbr.com/ai-detector/, uploaded my PDF file, and got “0% chance of being generated with AI” but a high risk of plagiarism, because Jenni.AI used the same wording as in the source. On closer look, though, the quotes are there and the citations as well. I don't think it is plagiarism. Anyway, all this needs to be rewritten by myself so as not to lose my voice to the AI.

Lesson learned 4) I must rewrite everything a system like Jenni.AI writes for me, both to avoid losing my voice to the AI and to get past the plagiarism checkers.

Here I must note that some plagiarism checkers are so sensitive that they claim I am plagiarizing myself even when I am honestly writing about the same topic without copying and pasting.

Time will tell whether I will continue using and paying for the tool.

If you are interested, you can compare the literature review exported from Jenni.AI with the current version, rewritten and complemented over the weekend; I share both on my research website hosted at Åbo Akademi.

END

 
