Through the Technology Testing for Adult Learning and Employment (TTALE) grants from Walmart and the Walmart Foundation, the EdTech Center at World Education sought to better understand how technology tools could close the employment opportunity gap for lower-skilled, low-income adult workers and jobseekers.
In partnership with the Employment Technology Fund (ETF), we field tested seven digital tools, spanning four segments, that help individuals enhance their skills and achieve greater workforce mobility:
- Learning and Training
- Mentoring and Support
- Job Search and Placement
- Assessment and Matching
For each of the seven tools, we worked with the technology developers to identify an implementation partner and site, with the aim of examining tool usage in geographically diverse areas and settings.
Together with the technology developer and their partners—usually an adult education provider, workforce development agency, nonprofit, or employer—we crafted an initial field test plan, shaped by shared goals, to help us do two things:
- Arrive at recommendations for software shifts required to support end-user engagement and persistence
- Define characteristics of each setting to inform use scenarios
Iteration was a defining characteristic of the TTALE field-testing process, both within the testing of an individual tool and across the testing of all tools. Because the tools became ready for testing in a staggered sequence, the field-testing plan and implementation process we crafted for each new deployment benefited from the experience of testing previous tools.
Between January 2018 and February 2019, the EdTech Center research team conducted interviews with individuals across all of the sectors that support these working learners and job seekers, including the learners themselves, technology developers, educators of adults, workforce development professionals, nonprofit partners, and employers.
For each of the specific tools and implementation sites, we conducted site visits to gather data from a wide variety of sources, including the following:
- Engagement data and analytics reported within the technology tool itself
- Interviews with learners and job seekers
- Interviews with key staff at the implementation sites who either used the tool or mediated use of the tool for learner end users
- Resources and media from the field-test setting
We also used each of the tools ourselves to better understand how features played out as either barriers or affordances within the particular use scenario. As we explored each tool and interviewed the stakeholders using it, we evaluated several criteria:
- Ease of Use and Navigation
- Features and Design
- Technical Aspects
- Data Security
We created a rubric adapted from widely used, research-based evaluation tools, including the National Standards for Quality Online Courses, the Lea(R)n Grading Protocol, the Quality Matters Rubric Standards, and the FLOE Inclusive Learning Design Handbook. The rubric's components are summarized below.
Because each tool and implementation site was unique, we did not design a single data collection tool to use across the board. Instead, we designed customized data collection processes and resources for each site, meeting the intent of the field test while minimizing disruption to the implementation site and the adults it served.
Following our site visits, tool reviews, and interviews, we triangulated the data, cross-verifying each finding against two or more sources.
Following analysis, we shared what we had learned and, where needed, suggested development shifts to help each tool better support working learners. We also identified key learnings and effective practices for wider adoption as new technologies emerge to support adult learning and employment.
Components for Evaluation of Tech Tools
- Navigation throughout the resource is logical, consistent, efficient, and predictable; text style, color, graphics, and icons are used aptly to guide the learner; the resource includes a tutorial to help learners get started and provides access to support materials along the way.
- Resource design reflects an understanding of learners’ needs and uses varied approaches, including interaction, to engage learners and help them accomplish objectives; employs relevant and easy-to-use media; organized into logically sequenced units.
- Course content is accurate, and assignments have sufficient rigor, depth, and breadth to teach the standards; the resource reflects an appropriate reading level; reflects cultural diversity and is bias-free; free of adult content and distracting advertisements.
- Assessments measure the stated objectives and are consistent across units; the resource includes adequate opportunities to assess and report learner mastery and progress, as well as opportunities for learner self-assessment.
- Stated objectives/competencies are aligned with relevant state or national standards and are appropriate for the target learner audience; learning objectives/competencies are measurable; the relationship between learning objectives and activities is evident or stated.
- The resource promotes achievement of the stated learning objectives/competencies and includes activities that engage learners in active learning and/or provide multiple learning paths or alternative activities based on their needs.
- Resource design and features facilitate readability for a broad group of learners; the resource is developed with universal design principles and accessibility guidelines in mind.
- The resource functions as intended; all technology requirements (including hardware, browser, and software) are specified; prerequisite technology skills are identified.
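To make the rubric concrete, the components above can be represented as a simple scoring structure. This is an illustrative sketch only: the component identifiers and the 1–4 rating scale are assumptions for demonstration, not part of the published rubric.

```python
# Illustrative sketch: component names and the 1-4 scale are assumptions,
# not taken from the published TTALE rubric.
from statistics import mean

RUBRIC_COMPONENTS = [
    "ease_of_use_and_navigation",
    "features_and_design",
    "content_quality",
    "assessment",
    "objectives_alignment",
    "instructional_approach",
    "accessibility",
    "technical_aspects",
]

def score_tool(ratings: dict) -> float:
    """Average the 1-4 ratings across all rubric components.

    Raises ValueError if a component is missing or a rating is out of range.
    """
    missing = set(RUBRIC_COMPONENTS) - set(ratings)
    if missing:
        raise ValueError("missing ratings for: %s" % sorted(missing))
    for name, value in ratings.items():
        if not 1 <= value <= 4:
            raise ValueError("rating for %s must be 1-4, got %r" % (name, value))
    return mean(ratings[c] for c in RUBRIC_COMPONENTS)

# Hypothetical review: every component rated 3 except accessibility (2).
ratings = {c: 3 for c in RUBRIC_COMPONENTS}
ratings["accessibility"] = 2
print(round(score_tool(ratings), 2))  # 2.88
```

A flat average is the simplest aggregation; a real evaluation could weight components differently (for example, weighting accessibility more heavily for a broad adult-learner audience).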