AI-powered learning platforms are spreading quickly, but the market is clouded by confusion and hype.
To build public confidence in the educational potential of AI, the industry needs to adopt common benchmarks and standards to ensure safe and responsible use of AI in education.
Today, users – parents, students, educators – have no reliable way to know whether an AI-enabled product is safe and effective, and many are wary of such tools as a result.
To address this, Riiid, a leader in AI-powered education solutions, and DXtera Institute, a non-profit membership organization that uses technology to reduce barriers to education delivery, have formed a cross-sector alliance of companies, non-profit organizations, and education agencies to develop benchmarks and standards for AI in education.
Launched in August, the initiative focuses on establishing benchmarks and standards in four key categories: safety (security and privacy), accountability (defining stakeholder responsibilities), fairness (equity, ethics, and absence of bias), and efficacy (improving learning outcomes at scale). In a word: SAFE educational AI.
DXtera, a trusted non-profit player, manages the Alliance’s day-to-day operations, serving as its fiscal and contracting agent for hiring staff and experts. Through membership fees, sponsorships, and philanthropic support, the Alliance intends to become self-sustaining in the long run. Riiid, which funded the initiative’s founding, is playing an active role in recruiting new member organizations.
In the three months since the alliance began, it has grown to more than 20 member organizations representing fifteen countries. Participants include Carnegie Learning, ETS, GSV Ventures, the German Alliance for Education, the EduCloud Alliance, and Digital Promise. The Alliance has also aligned itself with UNESCO’s Broadband Commission for Sustainable Development, which aims to connect everyone in the world to the Internet.
The Alliance expects eventually to hire paid experts to develop standards against which products can be tested and certified. It will not work in a vacuum or develop those standards from scratch.
Underwriters Laboratories, a private certification company, is a member of the alliance and has independently developed a rubric it uses to test algorithms. UL, as it is known today, has been involved in the safety testing of new technologies since its founding in 1894.
Almost every American product that uses electricity carries the UL logo, meaning it has been rigorously tested against a variety of standards.
The Alliance intends to do the same for AI education tools and platforms, eventually implementing a voluntary review process whose results consumers can trust the way they trust the nutrition labels on packaged food today.
Rigorous testing can also help determine whether products comply with existing data privacy laws, such as the European Union’s General Data Protection Regulation and California’s data privacy statutes.
The Alliance hopes that school districts and other institutions will then use its certification to guide their procurement of AI-enabled educational technology.
The alliance is not focused solely on the US market. It has engaged organizations from Israel and Russia, as well as the EU EdTech Consortium, which represents all EU countries, and Education Alliance Finland, among others. The German Alliance for Education brings to the table representatives from about 100 groups, from ministries and companies to universities and schools.
AI has the potential to transform learning, free teachers from administrative burdens, and personalize learning paths for students. But to realize that potential, we need standards that everyone can trust. We invite stakeholders from across the education sector – providers, users, and governments alike – to participate.