This AI Literacy Review covers numerous calls for AI literacy programs, AI literacy among US academic librarians, a museum exhibit and an escape room for teaching AI concepts, AI and writing, equity issues in AI usage for college admissions, and more.

General

At the Global IndiaAI Summit session The Future of Work: Ensuring AI Literacy for All, Dr B. Shadrach discussed the impact of AI on millions of workers worldwide and suggested that a generic learning program could help informal sector workers adapt to AI. He described a mobile phone-based curriculum in vernacular languages that is being tested among 20,000 farmers.

Jeanne Beatrix Law announces a new Coursera course from Kennesaw State University called AI for Grant Writing that teaches people how to use generative AI for grant writing. 

Datacamp’s survey of 550 business leaders, The State of Data and AI Literacy 2024, finds that 62% believe their organization has an AI literacy skill gap, and 70% see a basic understanding of AI concepts as an important skill for employees’ day-to-day tasks.

Kimberly Pace Becker discusses how using AI as a collaborative writing tool requires returning to the basic principle of effective communication: being audience-centered, and learning how to communicate with chatbots. (see Kimberly Pace Becker’s LinkedIn post)

Nicole Leffer posts a reminder that you can master AI skills with a few core tools and don’t need to chase after all of the new AI tools being released. She recommends picking a widely used tool and focusing on learning its capabilities, then finding complementary tools to learn about if needed. (see Nicole Leffer’s LinkedIn post)

Sanjay K. Mohindroo reiterates that AI adoption in an organization has to go beyond understanding the technology; staff have to know how to effectively apply it to workflows and feel comfortable experimenting and learning from failure. (see Sanjay K. Mohindroo’s LinkedIn article)

EqualAI CEO Miriam Vogel says AI literacy programs are essential to getting people to engage with the technology, and that AI literacy means critical thinking and ensuring kids are AI-ready. (see Megan Morrone’s Axios article How fear of AI could limit its benefits to a few)

Libraries and Museums

A new study of AI literacy among US academic library staff by Leo S. Lo titled Evaluating AI Literacy in Academic Libraries shows that most respondents reported a moderate level of familiarity with AI tools, though the majority didn’t actually use them; still, 44% thought it was extremely important that academic librarians receive training on AI tools in the next 12 months. Some believed librarians should be teaching AI literacy to students and other library patrons. The study proposes seven key competencies as a framework for AI literacy in academic libraries, including utilizing AI tools effectively and appropriately.

Hasti Darabipourshiraz et al. develop a museum exhibit for middle-school-age children called DataBites that helps children learn about the AI concept of supervised machine learning through a pizza/sandwich activity. (read the paper DataBites: An embodied, co-creative museum exhibit to foster children’s understanding of supervised machine learning)

Education

The National Education Association approved a policy statement about AI and its impact on public schools, and stressed the need for students and educators to become AI literate, which means knowing the basics but also the uses of AI, the biases, and ethical considerations involved: “AI literacy should be developed across the curriculum, not just in computer science and related courses. Furthermore, all students should have access to a rich AI literacy curriculum, not just the most advantaged or most advanced.” (read the Report of the NEA Task Force on Artificial Intelligence in Education)

David Joyner’s article What does it mean for students to be AI-ready? in Times Higher Education says we owe it to students to prepare them with AI skills for the world they are headed into, and that conversations about AI literacy are too restricted to the extremes. 

Dan Fitzpatrick’s article ChatGPT Forces Universities to Adapt Or Retreat in Forbes examines a study conducted at the University of Reading (UK) showing AI-generated submissions achieved higher grades and mostly went undetected by markers, with the response being to return to invigilated, in-person exams. Fitzpatrick questions whether this approach will prepare students for an AI-ubiquitous world, and sees the challenge for educational institutions being how to teach students to “effectively and ethically use AI tools, while also honing the critical thinking and creative skills that remain uniquely human.”

Danny Scuderi believes that the Humanities have an advantage when it comes to the skills needed for prompt engineering, because both prioritize accurate, concise writing for a specific purpose and audience. (see Danny Scuderi’s LinkedIn post)

In his article When AI Triggers Our Imposter Syndrome, Marc Watkins writes about imposter syndrome in terms of talking about generative AI as a non-technical, non-computer science expert to students, educators, and others. He discusses how the willingness to be imperfect, to fail, to get things wrong, will actually help because the generative AI field is rapidly evolving anyway. He reminds us that writing and learning existed long before generative AI, and that there is the opportunity to join with students to grapple with the implications of this new tech. 

Tim Dasey points out the need for people who aren’t AI developers but who can translate between the tech and education worlds and help educators understand the pros and cons of AI tech, define requirements, and describe desired AI behavior for learning. Otherwise, it’s edtech at the wheel. (see Tim Dasey’s LinkedIn post)

Cheryl Tice writes about how students are looking to instructors to be the guides and mentors to help them develop AI literacy skills, and instructors should encourage strategic and ethical use, have open discussions, and ask students to share their AI chats. This avoids driving student use of AI underground. Yes! (see Cheryl Tice’s LinkedIn post)

There were several sessions covering AI literacy at the Teaching and Learning with AI 2024 symposium. Emily Rush presented on AI Literacy: Fostering an Intertwined Relationship between Pedagogy and Technology in Higher Education, exploring how AI literacy is crucial to ensuring faculty and students can learn about AI-powered tools and make informed decisions about how to use them in educational contexts. Sierra Adare-Tasiwoopa ápi presented on Gamify Generative AI as a Way to Teach AI Literacy, offering an innovative way to create escape room challenges to teach people how to critically engage with AI-generated content. 

Anna Mills presented a session at MYFest24 titled Writing Feedback: Supporting a Human-Centered Writing Process and Building AI Literacy, covering how students can build critical AI literacy by examining feedback from AI systems and noticing when those systems get things wrong or misaligned. Mills also touched on how to keep AI systems from reinforcing the dominance of Standard English. (see Anna Mills’ LinkedIn post)

Foundry10’s research by Jennifer Rubin et al. on students’ use of AI tools for help with college applications found that 30% were using generative AI tools to assist with writing personal essays; their top reasons were to improve quality, perform grammar and spellchecks, generate ideas and content, save time, and reduce stress. Students who weren’t using AI for college application essays preferred human work, saw ethical implications and accuracy concerns with the tools, or weren’t aware of or familiar with them.

The study also uncovered some equity issues. There was a notable income disparity, with higher-income students more likely to use it than lower-income students (40% vs. 22%). And in an experimental part of the study, it turned out that applicants were rated as less ethical, less competent, and less likable if the person rating their essay believed that ChatGPT had been used (as opposed to a college admissions coach or no help at all). For students who can’t access personal coaching assistance or struggle with writing skills, this presents a potential obstacle to using generative AI tools if they carry a stigma. (read foundry10’s Navigating College Applications with AI white paper)

Rebecca Bultsma recommends that schools develop clear AI use policies and overcommunicate them to teachers and students, and educate everyone on ethical AI use. This would help avoid the situation of an increasing number of students feeling unfairly punished for cheating, even when using tools such as Grammarly that have previously been promoted for educational purposes. (see Rebecca Bultsma’s LinkedIn post)

A journal article by Yoshija Walter titled Embracing the future of Artificial Intelligence in the classroom: the relevance of AI literacy, prompt engineering, and critical thinking in modern education reviews the need for AI literacy and prompt engineering skills through the case study of a Swiss university.
