How ISE's Professor Abigail Wooldridge is redesigning health care around the people who make it work.

10/15/2025 | Written by Ashley Sims

Professor Abigail Wooldridge of the Department of Industrial and Enterprise Systems Engineering applies human factors and systems engineering to redesign health care around the people who make it work. Her research spans AI, surgery and maternal health, creating safer, smarter systems that improve both clinician performance and patient outcomes.
By the time a postpartum hemorrhage becomes visible, it may already be too late. But what if the warning signs were buried not in biology, but in the system itself?
Poor lighting. Inaccessible supplies. Confused team roles. A medical record that doesn’t flag risks clearly enough.
To Professor Abigail Wooldridge, these aren’t just problems. They’re symptoms of a deeper issue: systems that don’t support the humans inside them.
As a professor in the Department of Industrial and Enterprise Systems Engineering (ISE) in the Grainger College of Engineering at the University of Illinois Urbana-Champaign, Wooldridge is reshaping some of the most critical parts of modern medicine. Her work blends cognitive science, human factors and systems engineering to reimagine health care not as a collection of tools but as a network of people, processes and interactions that must be designed with care.
In a series of recent publications, Wooldridge has emerged as a national voice on a topic that’s long been overlooked: how to build health care systems that are as humane as they are high-tech.
From Algorithms to Operating Rooms
Wooldridge’s recent research spans multiple domains, from AI to obstetrics to surgery, but it is all driven by the same question: What happens when we build health care systems without considering how people think, work and make decisions under pressure?
In a 2025 article published in the Journal of Medical Internet Research, she and her co-authors examined why so many promising AI tools fail to gain traction in clinical environments. After analyzing 38 studies of AI in medical imaging, they uncovered 12 recurring implementation barriers, ranging from workflow mismatches and lack of training to mistrust and medicolegal ambiguity. Notably, high algorithmic accuracy was not a predictor of adoption. Instead, tools thrived when:
- They fit seamlessly into existing workflows.
- They were trusted through transparency and explainability.
- They minimized extra clicks, delays or disruptions.
“There’s a misconception that if the model is accurate, clinicians will adopt it,” Wooldridge says. “But people don’t work in labs. They work in complex environments where trust, timing and usability all matter.”
The paper urges a new sociotechnical paradigm: designing AI systems not just for performance, but for fit with people, workflows and existing responsibilities.
When Design Fails, Safety Suffers
Another thread in Wooldridge's research tackles the toll of flawed design in high-stakes clinical settings like the operating room. In a perspective paper published in The American Journal of Surgery and co-authored with experts from Mayo Clinic and the Society of Surgical Ergonomics, she synthesizes research in cognitive psychology, ergonomics and surgical safety to argue for a greater emphasis on cognitive ergonomics in the operating room.
Unlike physical ergonomics, which addresses posture, movement and equipment design, cognitive ergonomics focuses on the mental workload, team dynamics, situational awareness, communication under stress and decision-making that affect performance. In the operating room, even small distractions or ambiguities can trigger a cascade of failures.
Wooldridge and her collaborators reviewed both qualitative and quantitative evidence, including:
- Studies of mental workload measurement in surgical teams (e.g., NASA-TLX rating scales and EEG monitoring; a brief scoring sketch follows this list)
- Observational research on team communication patterns under stress
- Data on error rates linked to cognitive overload
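For readers unfamiliar with NASA-TLX: the instrument asks each team member to rate six workload dimensions (mental demand, physical demand, temporal demand, performance, effort and frustration) on 0–100 scales and, in its weighted form, derives per-dimension weights from 15 pairwise comparisons. A minimal sketch of the standard weighted score, with purely illustrative numbers:

$$\text{Overall workload} = \frac{1}{15}\sum_{i=1}^{6} w_i \, r_i, \qquad \sum_{i=1}^{6} w_i = 15, \quad r_i \in [0, 100]$$

For example, ratings of 80, 70, 60, 50, 40 and 30 paired with weights 5, 4, 3, 2, 1 and 0 yield (400 + 280 + 180 + 100 + 40 + 0) / 15 ≈ 67 on a 0–100 scale, a single number that can be compared across tasks, roles or operating-room conditions.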
The authors highlight how interruptions, unclear role assignment and alarm fatigue can degrade performance, at times more than physical strain. Their findings call for closer partnerships between clinicians and engineers to design systems that minimize overload and optimize team performance. They propose strategies like role clarity protocols, optimized alarm hierarchies and structured communication tools.
“Surgeons don’t work in isolation,” Wooldridge explains. “Their performance is shaped by an ecosystem of team roles, communication channels and environmental factors. We can design that ecosystem to support them.”
Rethinking Maternal Health from the System Up
Perhaps the most comprehensive example of Wooldridge’s systems approach comes from her work on postpartum hemorrhage (PPH), a leading cause of maternal mortality that persists despite available interventions.
In a recent study published in the International Journal for Quality in Health Care, Wooldridge's team observed more than 37 hours of real clinical activity and interviewed 29 health care providers involved in labor and delivery. Using the SEIPS (Systems Engineering Initiative for Patient Safety) framework, which models how people, tasks, tools and technologies, the physical environment and the organization interact to shape care processes and outcomes, they mapped 753 factors that either enabled or impeded safe care.
What they found was eye-opening. Even when everyone had the right training and intentions, the system often worked against them. Critical roles weren't clearly assigned. Tools were out of reach. Electronic records didn't support early warning. And environmental factors, such as poor lighting, made it harder to detect bleeding early.
The team proposed four major redesigns:
- Assign dedicated coordination roles during high-risk deliveries
- Integrate point-of-care decision tools with real-time data
- Optimize room layout and lighting for rapid detection and intervention
- Redesign risk assessment tools for use under high-pressure conditions
A Systems Leader in Human-Centered Health Care
What ties all this work together is Wooldridge's deep belief that health care safety isn't just a matter of skill or technology; it's a matter of system design.
That belief has positioned her as a national leader in applying industrial and systems engineering to human-centered health care challenges. She’s worked alongside clinicians at Mayo Clinic, Carle Foundation Hospital, and other major institutions, published in top journals across engineering and medicine, and trained students to think critically about how systems shape lives.
She's also part of a growing movement within the Grainger College of Engineering to embed empathy, equity and usability into engineering research, from education to health care to AI ethics.
The Human Impact
In a field often dominated by metrics and machinery, Wooldridge’s work re-centers health care around the people and systems that power it: the surgeons trying to concentrate, nurses navigating emergencies, clinicians deciding whether to trust an AI alert.
Her research reminds us that the biggest breakthroughs in medicine might not come from new drugs or devices, but from better-designed systems that support the work already being done. Because ultimately, the future of health care isn’t just high-tech. It’s human-engineered.
Abigail Wooldridge is an Assistant Professor in the Department of Industrial and Enterprise Systems Engineering in the Grainger College of Engineering. Her research interests include the patient journey and safety, care transitions and team cognition. Learn more about Wooldridge's work by visiting the Human Factors in Sociotechnical Systems Laboratory.