Karen Caldwell

AI PRODUCT CASE STUDY
VOLUNTEER SCHEDULE ASSISTANT
PROJECT OVERVIEW
Since 2016, I’ve been collaborating with the Karen Beasley Sea Turtle Rescue & Rehabilitation Hospital to modernize the manual, paper-based operations of their Volunteer Nest Tracking Program on Topsail Island, North Carolina. This initiative is supported by over 300 volunteers who dedicate more than 57,000 hours annually to patrol the island’s 26-mile coastline, monitoring and managing sea turtle nests.
Historically, the program relied on physical registration forms and printed Google Docs, which made volunteer scheduling time-consuming and error-prone. In response to this challenge, I led the UX strategy and design for a feature called the Volunteer Schedule Assistant — a tool designed to streamline the process of assigning volunteers to weekly beach patrol shifts.

The Volunteer Schedule Assistant was integrated with both a mobile registration flow and a desktop-based administrative dashboard. Volunteers scanned a QR code to register via mobile, and their submitted information (such as GPS location, patrol day preferences, and years of experience) was analyzed to generate intelligent weekly shift assignments. These assignments were then surfaced to program administrators for review and approval on the desktop app.
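To make that data flow concrete, here is a minimal sketch of the kind of registration record the assistant reasons over. The field names, types, and example volunteer are my own illustration, not the production schema:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class VolunteerRegistration:
    """One mobile registration submission (field names are illustrative)."""
    name: str
    home_lat: float                 # GPS location captured during sign-up
    home_lon: float
    preferred_days: List[str] = field(default_factory=list)  # e.g. ["Tue", "Sat"]
    years_experience: int = 0       # seasons of prior patrol experience

# Example of what one QR-code registration might produce (hypothetical volunteer)
example = VolunteerRegistration(
    name="Pat Rivera",
    home_lat=34.44,
    home_lon=-77.53,
    preferred_days=["Tue", "Sat"],
    years_experience=3,
)
```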
PROBLEM STATEMENT
The Nest Tracking Program’s volunteer scheduling process was historically manual, taking up to two weeks each season to assign volunteers to beach patrol shifts.
One of the most significant pain points occurred when volunteers couldn’t complete their assigned walk due to illness, travel, or emergencies. Since there was no centralized or digital method for managing substitutions, coordinators had to manually sift through printed paperwork to identify potential backups—an inefficient and error-prone process that left beach segments vulnerable to missed patrols.
Although most volunteers selected appropriate areas and availability, the overall scheduling logistics—balancing experience levels, beach segment needs, and weekly availability—were too complex to manage efficiently without automation. The existing tools lacked the intelligence and flexibility needed to support real-time adjustments or backup planning.

USERS & STAKEHOLDERS

Manual Scheduling Was Time-Consuming and Error-Prone
- Scheduling over 300 volunteers across 26 beach segments took up to 2 weeks using paper forms and printed Google Docs.
- Coordinators had to manually cross-reference availability, experience, and location — often resulting in coverage gaps and late substitutions.
- If a volunteer couldn’t walk their segment, staff had to dig through paperwork to find a backup — wasting critical time during nesting season.
RESEARCH & INSIGHTS
Impact
- Cut volunteer scheduling time by 90%, transforming a 2-week manual process into minutes for 300+ volunteers.
- Spearheaded the shift from paper notebooks to a mobile data platform, increasing data accuracy by 85% and enabling real-time nest tracking and visibility across the entire volunteer network.
What’s Next
- Stakeholders are excited to implement this feature for the 2026 nesting season.
- Full rollout is pending development funding.
- The next step is to begin research on requirements for the desktop admin dashboard.
Key Learnings
- Designing for non-technical users reinforced the importance of clarity and simplicity.
- Prototyping AI logic helped me better communicate ideas to future developers.
- Smart automation works best when paired with manual controls for real-world flexibility.
FEATURE DEFINITION
Smart Scheduling Logic
- Experience Level – Priority goes to returning volunteers.
- GPS Location – Local volunteers are assigned closer to home.
- Availability – Preferred days are factored into the schedule.
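During prototyping I modeled this logic roughly as a weighted score per volunteer, segment, and day. The sketch below is illustrative only; the weights, field names, and distance formula are assumptions rather than final product logic:

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Approximate great-circle distance in miles between two GPS points."""
    radius = 3958.8  # Earth radius in miles
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(math.radians(lat1)) * math.cos(math.radians(lat2)) * math.sin(dlam / 2) ** 2)
    return 2 * radius * math.asin(math.sqrt(a))

def score_assignment(volunteer, segment, day):
    """Higher score = better fit. Weights are illustrative assumptions."""
    experience = min(volunteer["years_experience"], 10) / 10            # returning volunteers rank higher
    distance = haversine_miles(volunteer["home_lat"], volunteer["home_lon"],
                               segment["lat"], segment["lon"])
    proximity = 1 / (1 + distance)                                       # closer to home ranks higher
    availability = 1.0 if day in volunteer["preferred_days"] else 0.0    # preferred days rank higher
    return 0.4 * experience + 0.3 * proximity + 0.3 * availability
```

The scores only rank suggestions; as described below, coordinators still review and can override every assignment before the schedule is finalized.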
How It Works
- Volunteers register via mobile and receive assignments.
- They confirm their assignment or contact coordinators if changes are needed.
- Coordinators can review, override, or adjust the schedule before it’s finalized.
Backup Planning
If a volunteer cancels:
- The system suggests top backup candidates based on availability and proximity.
- Coordinators receive a notification and can make fast, informed substitutions.
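A minimal sketch of how backup suggestions could be ranked when a cancellation comes in, again using availability and proximity. The data shapes, proximity proxy, and example volunteers are assumptions for illustration:

```python
def suggest_backups(cancelled_shift, volunteers, top_n=3):
    """Rank substitute candidates for a cancelled walk (illustrative only)."""
    day = cancelled_shift["day"]
    seg_lat, seg_lon = cancelled_shift["segment_lat"], cancelled_shift["segment_lon"]
    ranked = []
    for v in volunteers:  # volunteers not already scheduled that day
        available = day in v["preferred_days"]
        # Rough proximity proxy: squared degree difference (fine at a 26-mile island's scale)
        dist2 = (v["home_lat"] - seg_lat) ** 2 + (v["home_lon"] - seg_lon) ** 2
        ranked.append((not available, dist2, v["name"]))  # available first, then nearest
    ranked.sort()
    return [name for _, _, name in ranked[:top_n]]

# Hypothetical Saturday cancellation with three unscheduled volunteers
print(suggest_backups(
    {"day": "Sat", "segment_lat": 34.47, "segment_lon": -77.50},
    [
        {"name": "A. Lee",   "home_lat": 34.46, "home_lon": -77.51, "preferred_days": ["Sat"]},
        {"name": "B. Ortiz", "home_lat": 34.37, "home_lon": -77.62, "preferred_days": ["Sun"]},
        {"name": "C. Shaw",  "home_lat": 34.52, "home_lon": -77.45, "preferred_days": ["Sat", "Wed"]},
    ],
))  # ['A. Lee', 'C. Shaw', 'B. Ortiz']
```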
Design Process & Implementation
Design Process
- Used ChatGPT to explore edge cases and user needs.
- Created wireframes using v0.dev and built a working prototype in Figma.
- Simulated back-end logic in Google Colab to model AI scheduling behavior.
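The Colab simulation was essentially a toy version of the weekly assignment pass: generate synthetic volunteers, then greedily fill each day-and-segment slot with the best remaining candidate. The sketch below mirrors that idea; the synthetic data, weighting, and one-walk-per-week assumption are mine, not production behavior:

```python
import random

random.seed(1)
DAYS = ["Mon", "Tue", "Wed", "Thu", "Fri", "Sat", "Sun"]
SEGMENTS = [f"Segment {i}" for i in range(1, 27)]   # the island's 26 beach segments

# Synthetic volunteer pool (real registrations come from the mobile flow)
volunteers = [
    {"name": f"Volunteer {i}",
     "years_experience": random.randint(0, 8),
     "preferred_days": random.sample(DAYS, k=2)}
    for i in range(1, 301)
]

def fit(volunteer, day):
    # Preferred day dominates; experience breaks ties (illustrative weighting)
    return (day in volunteer["preferred_days"], volunteer["years_experience"])

schedule, assigned = {}, set()
for day in DAYS:
    for segment in SEGMENTS:
        pool = [v for v in volunteers if v["name"] not in assigned]
        best = max(pool, key=lambda v: fit(v, day))
        schedule[(day, segment)] = best["name"]
        assigned.add(best["name"])          # one walk per volunteer per week

print(f"{len(schedule)} of {len(DAYS) * len(SEGMENTS)} weekly slots filled")  # 182 of 182
```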
User Testing
- Conducted moderated tests with coordinators using the Figma prototype.
- Focused on core tasks like reviewing, confirming, and adjusting schedules.
- Feedback confirmed the design was intuitive — even for non-tech-savvy users.
Accessibility & Simplicity
- Designed with minimal interfaces and clear labels.
- Prioritized easy navigation to support users with varying tech experience.

OUTCOME
REFLECTION
- Designing this assistant helped me explore how AI can support real-world scheduling challenges in a volunteer-driven program.
- I learned the value of balancing automation with manual control, especially for non-technical users.
- Building for simplicity and clarity was key — even powerful tools need to feel approachable.
- This project deepened my ability to prototype smart systems, even before full development resources are in place.

TOOLS & METHODS
ChatGPT
v0.dev
Google Colab
Figma