PELE2026: Seventh Annual Workshop on A/B Testing and Platform-Enabled Learning Engineering
Seoul, South Korea, June 27, 2026

| Conference website | https://sites.google.com/carnegielearning.com/pele-2026/home |
| Submission link | https://easychair.org/conferences/?conf=pele2026 |
| Abstract registration deadline | May 4, 2026 |
| Submission deadline | May 4, 2026 |
Seventh Annual Workshop on A/B Testing and Platform-Enabled Learning Engineering (PELE)
We’re delighted to be hosting the Seventh Annual Workshop on A/B Testing and Platform-Enabled Learning Engineering (PELE) at the Festival of Learning 2026 in Seoul, South Korea. All members of the community are invited to attend the workshop and/or submit papers or demos.
Call For Papers
“There is no simple path that will take us immediately from the contemporary amateurism of the college to the professional design of learning environments and learning experiences. The most important step is to find a place on campus for a team of individuals who are professionals in the design of learning environments - learning engineers, if you will.” [1] - Herbert Simon
Learning engineering adds tools and processes to learning platforms to support improvement research [2]. One kind of tool is A/B testing [3], which is common in large software companies and also represented academically at conferences like the Annual Conference on Digital Experimentation (CODE). A number of A/B testing systems focused on educational applications have arisen recently, including UpGrade [4] and E-TRIALS [5].
A/B testing can be part of the puzzle of how to improve educational platforms, and yet challenging issues in education go beyond the generic paradigm. For example, the importance of teachers and instructors to learning means that students are not only connecting with software as individuals, but also as part of a shared classroom experience. Further, learning in topics like mathematics can be highly dependent on prior learning, and thus A or B may not be better overall, but only in interaction with prior knowledge [6]. In response, a set of learning platforms is opening their systems to improvement research by instructors and/or third-party researchers, with specific supports necessary for education-specific research designs.
This workshop will explore how A/B testing in educational contexts is different, how learning platforms are opening up new possibilities, and how these empirical approaches can be used to drive powerful gains in student learning. It will also discuss forthcoming opportunities for funding to conduct platform-enabled learning research.
This half-day workshop will be devoted to presentations and discussions of accepted papers. We will organize presentations according to major themes (e.g., “adaptive algorithms”, “school communication strategies”), with the expectation that 2-3 themes will be addressed during the workshop.
We will accept both papers (5-10 pages) and short submissions for work-in-progress papers or demos (up to 4 pages). Authors of accepted papers will have 25 minutes to present, followed by 5 minutes for questions. Authors of accepted short submissions will present their work as shorter talks or demo sessions following the long presentations.
Submission Guidelines
The deadline to submit is May 4th, 2026.
An optional Early Decision deadline of April 24, 2026 is available for those who wish to receive an acceptance decision before the early-bird registration deadline (May 3, 2026).
For 2026, we are inviting two submission types:
- Papers: 5-10 page PDFs in CHI / ACM format (Word, LaTeX, Overleaf). References are not included in the page limit.
- Work-in-Progress & Demos: up to 4-page PDFs in CHI / ACM format (Word, LaTeX, Overleaf). References are not included in the page limit.
List of Topics
Papers and demos may address issues in conducting A/B testing and platform-enabled learning engineering, including topics such as:
- The role of A/B testing systems in complying with SEER principles (https://ies.ed.gov/seer/), which set a high bar for the goals of empirical studies of educational improvement
- Awareness of opportunities to use existing learning platforms to conduct research (http://seernet.org)
- Using adaptive experimentation methods (e.g., multi-armed bandits) in education research
- Managing unit-of-assignment issues, such as those that arise when students are in classrooms with a shared teacher
- Practical considerations related to experimenting in school settings, MOOCs, & other contexts
- A/B testing within adaptive learning software
- Ethical, data security, and privacy issues
- Relating experimental results to learning-science principles
- Understanding use cases (core, supplemental, in-school, out-of-school, etc.)
- Accounting for interactions between the intended contrast (A vs. B) and learners’ prior knowledge, aptitudes, background, or other important variables
- Attrition and dropout
- Stopping criteria
- User experience issues
- Educator involvement and public perceptions of experimentation
- Balancing practical improvements with open and generalizable science
Organizing Committee
- Steve Ritter, Carnegie Learning
- Stephen Fancsali, Carnegie Learning
- April Murphy, Carnegie Learning
- Neil Heffernan, Worcester Polytechnic Institute
- Joseph Jay Williams, University of Toronto
- Jeremy Roschelle, Digital Promise
- Danielle McNamara, Arizona State University
- Debshila Basu Mallick, Rice University
- John Stamper, Carnegie Mellon University
- Norman Bier, Carnegie Mellon University
- Jeff Carver, University of Alabama
Venue
The workshop will be held on July 27, 2026, 12pm to 5pm (local time, KST, GMT+9) as part of Festival of Learning 2026, at COEX Convention & Exhibition Center, 513, Yeongdong-daero, Gangnam-gu, Seoul 06164, Republic of Korea.
Contact
All questions about submissions should be emailed to April Murphy at amurphy@carnegielearning.com or Stephen Fancsali at sfancsali@carnegielearning.com.
