This AI Literacy Review covers a creative squirrel analogy for Generative AI, the low detectability of AI-generated text, responses to Ted Chiang’s critique of AI in The New Yorker, gender and age gaps in AI usage and training in organizations, AI literacy for librarians and the healthcare workforce, resources and books for educators, AI in instructional design, and more.
General
Carlo Iacono offers a creative analogy for how to think about Generative AI and the importance of prompting: imagine a squirrel climbing a tree, where the branches represent the AI’s training, and a vague prompt is like blindfolding the squirrel. Check out the full analogy and picture of the “AI Tree of Possibilities” on Iacono’s LinkedIn post.
Distinguishing human-written from AI-generated text may well become an advanced skill, and it is getting harder. In Can linguists distinguish between ChatGPT/AI and human writing?, a study of 72 editors at top applied linguistics journals, the overall positive identification rate was only 39%, and no editor correctly identified all four of the research abstracts used in the test. A study in education, Do teachers spot AI? Evaluating the detectability of AI-generated texts among student essays, reported similar findings.
The U.S. National Science Foundation launches the EducateAI initiative with an investment of almost $8 million in five projects, including one that integrates AI literacy into community college programs. The initiative aims to broaden access to AI research resources and help ensure a diverse AI-ready workforce.
In AI literacy and Gartner’s “Trough of Disillusionment”, Luca Collina looks at how Gen. AI has moved beyond the hype and how companies might go through dips and peaks in implementing AI. Collina offers three critical elements for building AI capability, including cognitivist strategies that help learners connect new AI knowledge to existing business practices.
Jaspreet Bindra launches AI&Beyond to democratize AI literacy across industries in India and beyond by offering a suite of learning programmes to bridge the gap in how AI is being used across sectors.
In Why your AI outputs feel “average” Jon Ippolito reflects on writer Ted Chiang’s critique of AI in The New Yorker (Why A.I. Isn’t Going to Make Art), which argues that AI models deal in averages. Ippolito discusses how averages are indeed at work and can lead to mediocre results, but also how AI can be very versatile and yield impressive results. Jesse Damiani in What Ted Chiang’s New Yorker Essay Gets Wrong About Art and AI—and Why It Matters also responds to Chiang’s piece, concluding that it misrepresents both AI and art, and suggests that we instead advocate for artistic AI literacy among artists and the public.
Organizations
The AI Literacy for C-Suite Executives course offered by the Chulalongkorn School of Integrated Innovation in Bangkok, Thailand, aimed to give senior executives strategic knowledge they can apply in their organizations. The course included theory and practical applications, as well as a follow-up clinic for one-on-one consultations with AI experts.
In The real AI training gap? IT leaders believe in it, but many don’t provide it, Grant Gross reports that 95% of IT leaders think AI projects will fail without staff using AI tools effectively, yet many organizations provide little training for employees.
Slingshot’s 2024 Digital Work Trends report, based on a survey of 253 U.S. employees and managers, shows that only 23% of employees feel completely educated and trained on AI. There’s also a gender gap: 66% of men feel adequately trained, compared to 44% of women.
Slack’s Workforce Lab conducted a Workforce Index survey of over 10,000 desk workers across six countries that shows growing enthusiasm for AI, but also age and gender differences. 55% of workers aged 18-29 are excited for automation to handle parts of their work, compared to 33% of workers over 60. 35% of men have tried AI for work, compared to 29% of women, and the gender gap was largest among workers aged 18-29, at 25%.
Jay Tarzwell points out that companies without AI policies, or with overly restrictive ones, are leaving it up to employees to secretly use AI to increase productivity and reap the benefits on their own. (see Jay Tarzwell’s LinkedIn post)
Libraries
The white paper Building an AI Literacy Framework: Perspectives from Instruction Librarians and Current Information Literacy Tools by Sandy Hervieux and Amanda Wheatley (McGill University) has been published with support from Choice (Association of College and Research Libraries), LibTech Insights, and Taylor & Francis. The white paper aims to help academic librarians navigate the place of AI literacy in information literacy instruction.
The non-profit organization EIFL (Electronic Information for Libraries) compiles an AI training program outline with resources that librarians can use to guide AI education in their domain.
Healthcare
In AI for all, Michael Aubele reports on how Pitt Med aims to become a world leader in AI literacy. Its associate dean for AI in medicine, Hooman Rashidi, is rolling out an AI curriculum for medical students that includes lectures, a self-paced elective, and a more specialized elective that involves coding skills. He believes it is important for students to understand AI, and that this understanding will translate into improved patient outcomes and scientific discoveries.
In Excessive reliance on AI tools could hinder the development of interpersonal skills and empathy in Nursing Times, Polat Goktas, Aycan Kucukkaya, and Pelin Karacay from universities in Ireland and Turkey discuss the impact of ChatGPT on nursing education and how it might enhance students’ competencies in clinical settings. They also note the concern that nursing students might not develop the profession’s deeply held values of interpersonal skills and empathy.
Education
The Digital Education Council’s 2024 Global AI Student Survey shows that 86% of students have already used AI in their studies, with 54% using it on a weekly basis and 25% on a daily basis. 69% of students who use AI use it to ‘search for information’ [a sign that students treat it like a search engine, even though it doesn’t actually search for information]. Students also expressed a significant need for effective AI use cases and technology.
Pennsylvania State University has announced a new program called AI-Enhanced Pedagogy: Exploring Generative AI as a Collaborative Partner, with seven activities or projects for participants to experiment with Gen. AI and support student use of it in learning activities.
Jack Dougall shares resources from a day of AI training at his school for other school leaders or teachers to benefit from. Resources include worksheets, a chatbot, booklets, and videos. (see Jack Dougall’s LinkedIn post)
Philippa Hardman writes that Arizona State University, through its partnership with OpenAI, has launched 200 projects relating to AI, mostly focused on enhancing teaching and learning. She points to emerging skills for instructional designers as they leverage this tech in their work and explores these further in her blog post AI-Powered Instructional Design at ASU. (see Philippa Hardman’s LinkedIn post)
In another blog post, The Six AI Use Case Families of Instructional Design, Hardman offers six AI use case areas for instructional design, plus a graphic table for easy reference. Going beyond the basics, the use case areas include creative ideation, data-driven insights, adaptation and localisation, and intelligent assistance.
Joseph M. Moxley in Practice Critical AI Literacies describes an assignment that helps undergraduate students develop critical AI literacy, based on the Student Guide to AI Literacy developed by the Modern Language Association of America (MLA) and Conference on College Composition and Communication Joint Task Force on Writing and AI.
The open-access book Creative Applications of Artificial Intelligence in Education, edited by Alex Urmeneta and Margarida Romero, explores the synergy between AI and education and provides concrete examples of how educators in K-12 and higher ed can foster AI literacy along with creativity, critical thinking, and problem-solving.
Sue Thotz at Common Sense Education is presenting a Nearpod-sponsored webinar on September 18, 2024 titled AI Literacy: Supporting Educators in Teaching AI Digital Citizenship and Ethics, aimed at teachers, school and district leaders, and people in edtech.
In AI literacy a critical component in 21st-century learning, Hanelie Adendorff from the Centre for Teaching and Learning at Stellenbosch University in South Africa discusses what AI literacy is and why it matters, particularly in the context of higher education.
In How we’re improving AI literacy in young people, Google.org announces a $10 million commitment to teach and engage more young people in AI, directed at the Raspberry Pi Foundation to expand access to the educational program Experience AI.