This AI Literacy Review looks at new pathways for thinking about AI tools, job impacts, a lack of AI readiness, movement in governments, a student-led AI resource site, an AI teaching assistant, the role of AI in writing, whether to disclose usage, AI’s role in scholarly processes, and a gender gap in AI tool use among children under 15.
General
Allie K. Miller launches the LinkedIn Learning course How to Use Generative AI: Building an AI-First Mindset and also shares her “5 Steps of AI-First Productivity” to show where most AI users are getting stuck with these tools. (see Allie K. Miller’s LinkedIn post)
Ethan Mollick warns about something people aren’t used to considering when using Generative AI compared to traditional search. People are accustomed to “false negatives (a document that is there wasn’t found)” but not “false positives (details are made up that aren’t there)”. He cautions those developing “talk-to-your document RAG systems with AI” that people are still used to the ways of search engines and may expect these systems to work the same way. (see Ethan Mollick’s LinkedIn post)
Darren Coxon in A Man Alone in a Room reflects on the biases at the core of AI due to its training on published data, which has tended to be written by certain kinds of men over the past few centuries.
The AI-Enabled ICT Workforce Consortium publishes its report on the impact of Gen. AI on ICT job roles, which includes training/reskilling recommendations for 47 jobs and the finding that 100% of those jobs will require AI literacy skills.
Organizations
Microsoft’s Generative AI in Real-World Workplaces: The Second Microsoft Report on AI and Productivity Research looks at results from different studies of workers using Microsoft’s Copilot AI tools and finds that Generative AI is helping workers become more productive in their jobs, but that its influence varies based on role, function, organization, and how much people adopt and use the tools.
On Jordan Wilson’s Everyday AI podcast episode 328: AI Can (Almost) Do Your CEO’s Job, Greg Shove discusses a recent survey of knowledge workers showing only 8% were considered AI-ready. Shove recommends that companies boost AI adoption by investing in comprehensive training on effective prompting and use cases. (see the Everyday AI newsletter summary)
Aneesh Raman notes that only 39% of knowledge workers have been trained on AI at work even though 75% say they’re using it, and recommends building learning into the day-to-day of company culture. (see Aneesh Raman’s LinkedIn post)
Government
With the passing of the EU AI Act, legal teams are pushing out information about what the requirements mean for AI literacy in organizations. For example, William Fry’s The Time to (AI) Act is Now: A Practical Guide to AI Literacy Requirements under the AI Act gives a succinct overview of requirements, key dates, enforcement, and steps to compliance, which include assessing current AI literacy levels and implementing programs tailored to different roles within organizations.
U.S. Senators Mark Kelly and Mike Rounds introduce the Consumer Literacy and Empowerment to Advance Responsible Navigation of Artificial Intelligence (Consumers LEARN AI) Act, to move toward consumer awareness and confidence in using AI products and services. The Act received support from organizations such as the AARP and AIandYou. (read Senator Kelly’s press release)
Meanwhile, U.S. Representatives Gabe Amo and Tom Kean introduce the Literacy in Future Technologies (LIFT) Artificial Intelligence Act relating to the development of AI literacy curriculum in schools for students in grades K through 12, including evaluation tools to assess proficiency. Supporters include the STEM Education Coalition. (read Representative Amo’s press release)
Amy Jones in Driving agency AI literacy utilizing guardrails and frameworks gives government agencies strategic recommendations for AI literacy, including developing systematic programs to educate the workforce on the risks and nuances of Gen. AI, and encouraging a culture of security literacy so employees can better discern AI-generated content and look out for manipulation.
Education
Danny Liu posts about students at the University of Sydney working with faculty to expand the open-access AI in Education resource site with guidance and examples of students using Generative AI as an assistant instead of a replacement. The Canvas LMS site covers what Gen. AI is, how to use it, using it responsibly, applying it in different disciplines, and using it in different assessment activities. (see Danny Liu’s LinkedIn post)
The World Economic Forum’s Shaping the Future of Learning: The Role of AI in Education 4.0 Insight Report concludes that “Students, teachers and administrators must receive necessary training and upskilling opportunities oriented to their needs to help them make the most productive use of AI systems” and “Equity and inclusion considerations must be central to the design of programmes to ensure that AI literacy is widely imparted and the benefits of AI technologies in education accrue widely.”
William & Mary researchers publish a study, Evaluating the Effectiveness of LLMs in Introductory Computer Science Education: A Semester-Long Field Study, on how an AI teaching assistant called CodeTutor affected students’ learning outcomes in a computer programming course. They found that 63% of student prompts were unsatisfactory, suggesting a lack of AI literacy. The next step is to collaborate with local schools to design AI literacy camps for younger students. (read the William & Mary press release)
University of Florida’s article UF writing courses redefine the role of AI by Lauren Barnett discusses how assistant instructional professor in writing Zea Miller prepares students with a range of AI skills through his course Professional Writing in AI, which approaches AI from the steps of a possible career and includes tasks like client communication and pitch decks.
In AI’s Potential in Special Education: What Teachers and Parents Think in Education Week, Lauraine Langreo covers how educators and parents of students with disabilities are both optimistic and cautious about AI use in schools, and how AI literacy plays a role for students as well as parents and educators.
Anna Mills asks whether instructors should disclose their AI usage and provides a draft of her attestation statement about her AI usage. (see Anna Mills’ LinkedIn post)
Chris Basgier in Speedy Research at the Expense of Sense? The Pedagogy of AI Scholarly Search raises the issue of what skills might be lost with new AI research tools that aim to speed up the scholarly literature review process, advising that there is still a need for traditional research practices and deep immersion in a field to be able to critically examine the results of these new tools.
Ray Schroeder in Our Responsibility to Teach AI to Students for Inside Higher Ed writes that it’s critical for students to gain Gen. AI skills, especially those who are getting ready to enter the workforce soon.
In the continuing debate about whether or not to ‘cite AI’ in an educational context, Mike Kentz discusses how working with Gen. AI systems is an art, not a science, and requires Humanities-based skills, and how trying to cite AI or share a prompt with a teacher doesn’t tell the whole story of the student’s learning. (see Mike Kentz’s LinkedIn post)
A small online survey of parents found a gender gap among Generation Alpha children (ages 7 to 14) in using AI tools for fun, to learn new things, or for homework help, with 54% of boys using them vs. 45% of girls.