
Design of the Data Collection System for the Ohio Assistive Technology Outcomes (OhioATO) Project

Dave Edyburn
Sally Fennema-Jansen
Roger O. Smith
Susan Wilson
Mary Binion

Updated: September 13, 2006

Executive Summary

The Ohio Assistive Technology Infusion Project (ATIP) delivered $9.4 million of assistive technology to approximately 3,500 K-12 students during 2001-2003. This work generated what is believed to be the largest set of educational assistive technology outcomes data in the world. This report details the design of the data collection system created to support the administration and implementation of ATIP.

This report will provide readers with (a) an overview of the design and implementation of an electronic data collection system, (b) a model for other states, school districts, and agencies interested in replicating the procedures involved in large-scale data collection, and (c) an archive of historical documentation, since multiple reorganizations of state-level agencies have disbanded key project agencies (i.e., ORCLISH and the SchoolNet Commission). Future reports will be linked to this web site to detail the research outcomes of ATIP.

Acknowledgements

This project is funded in part by the U.S. Department of Education National Institute on Disability and Rehabilitation Research (NIDRR) under Grant # H133A010403 and the U.S. Department of Education School Renovation, IDEA, and Technology Grant # 84.352A. The opinions herein are those of the grantee and do not necessarily reflect those of the U.S. Department of Education.

Design of the Data Collection System of the Assistive Technology Infusion Project (ATIP)

Assistive Technology Funding Opportunity and Priority

Based on public hearings conducted through the Ohio Department of Education's State Improvement Grant and a study conducted by the Ohio Coalition for the Education of Children with Disabilities, funding for assistive technology devices and services was determined to be a high educational priority for the state of Ohio. In June of 2001, the Ohio Department of Education received a $36 million federal grant through the United States Department of Education School Renovation, IDEA, and Technology Grants program, of which $9.4 million was allocated for use in assisting local school districts in providing assistive technology (AT) devices for students with disabilities. This component of the federal grant award was subsequently named the Assistive Technology Infusion Project (ATIP).

ATIP Start-up and Organization

Upon receiving notification of the federal grant award, the Ohio Department of Education Office for Exceptional Children (ODE-OEC) established a management and implementation partnership with ORCLISH* and the Ohio SchoolNet Commission*. [*Note: Multiple reorganizations of state-level agencies have subsequently disbanded ORCLISH and the SchoolNet Commission.] In addition, an advisory board was established.

The collaborating agencies developed a process to disburse funding by which local educational agencies (LEAs) would have an opportunity to apply for funds for the purchase of specific assistive devices for individual students. The devices would be provided based on assessments conducted by local school personnel and according to the student's Individual Education Plan (IEP) or Individual Family Service Plan (IFSP). Districts would be responsible for assistive technology services (i.e., evaluation, training, repair, and maintenance) as part of their local contribution.

The design of the project sought to develop the assessment skills of local school teams by requiring them to complete specific steps of an assessment in order to apply for funding. The process also encouraged greater awareness at the district level of the need to carefully consider the provision of AT devices and services to students with disabilities. Given the size of the funding award, ATIP staff perceived the need to study the outcomes that result from the provision of assistive technology devices to individual students.

Collaboration With The ATOMS Project

To make the most of this opportunity, the ATIP staff collaborated with staff from the ATOMS (Assistive Technology Outcomes Measurement System) Project. The ATOMS Project was funded by the U.S. Department of Education National Institute on Disability and Rehabilitation Research (NIDRR) as a five-year project to systematically explore, pilot, and test AT outcome measurement ideas in order to recommend the next-generation outcome system for AT. The partnership between ATIP and ATOMS created a unique opportunity to study both quantitative and qualitative data related to the provision of AT in the schools.

Process and Procedures

Overview

An overview of the process designed by ATIP staff to infuse assistive technology into the Ohio K-12 public schools is illustrated in a flow chart (Appendix A) and a timeline (Appendix B). The entire process was designed to (a) promote best practices associated with the selection and use of AT, and (b) document efforts to support access to, and progress in, general education. Whereas the flow chart provides an overview of the general steps of the process, the following narrative describes the process in greater detail.

Information Dissemination: Request for Proposals

Informing schools about the opportunity to apply for funds for assistive technology through ATIP was a critical start-up activity that would facilitate the project's goal of meeting students' assistive technology needs. A variety of methods were used during the period (month/year to month/year) to disseminate information about the request for proposals (Appendix C, PDF), including (a) electronic distribution lists to superintendents, technology coordinators, supervisors of special education programs, and principals; (b) electronic monthly newsletters of the Ohio SchoolNet Commission; (c) the print newsletter of the Ohio Department of Education Office for Exceptional Children; (d) the print newsletter of the Ohio Coalition for the Education of Children with Disabilities; (e) the print newsletters and listservs of Ohio's sixteen Special Education Regional Resource Centers; (f) print brochures; (g) four web sites; (h) interactive video distance learning sessions; (i) audioconference calls; (j) web streaming of vendor videos; and (k) dissemination of information at state and regional conferences and meetings. The announcements about the request for proposals included the following components: (a) the purpose and parameters of the project, (b) the availability of the online application form, (c) training opportunities, (d) technology resource support, (e) vendor support, (f) application evaluation rubrics, and (g) application deadlines.

Online Application

To apply for assistive technology funding through ATIP, school district personnel were required to complete an online application (Appendix D, PDF). Guidelines (Appendix E, PDF) were also available to lead applicants through an assistive technology assessment via a series of questions, aligned with the steps of the IEP process, that team members were required to answer concerning: (a) present levels of performance, (b) statement of critical need, (c) goal development, (d) solution generation, (e) feature match, (f) trial, (g) solution selection, (h) implementation, and (i) evaluation.
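To make the guided assessment concrete, the sketch below (a hedged illustration, not the actual ATIP form or database schema) models a single application record with one required response per assessment step; all field names are assumptions introduced here for illustration.

```python
from dataclasses import dataclass, fields

@dataclass
class ATApplication:
    """Hypothetical record mirroring the nine guided assessment steps.
    Field names are illustrative; the actual ATIP form may differ."""
    present_levels_of_performance: str = ""
    statement_of_critical_need: str = ""
    goal_development: str = ""
    solution_generation: str = ""
    feature_match: str = ""
    trial: str = ""
    solution_selection: str = ""
    implementation: str = ""
    evaluation: str = ""

    def missing_steps(self) -> list[str]:
        """Return the assessment steps the team has not yet answered."""
        return [f.name for f in fields(self) if not getattr(self, f.name).strip()]

app = ATApplication(present_levels_of_performance="Reads at 2nd-grade level ...")
print(app.missing_steps())  # lists the eight remaining required responses
```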

Application Supports

For those who were new to assistive technology assessment, resources were provided to assist them in the process:

Deadlines

In order to manage the workflow (Appendix B) associated with receiving, reviewing, and awarding proposals, and to properly allocate the disbursement of funds, four rounds of funding were established. Deadlines were established at approximately six-month intervals: November 2001, May 2002, November 2002, and February 2003. The information dissemination procedures outlined earlier were used to inform school districts of the availability of funding for each round.

Review of Applications

Processing. After the application deadline closed for each round of funding, ATIP staff printed each application and made three copies in preparation for proposal review as part of a two-day, state-wide grant application review. Staff members sorted applications according to area of need and types of technologies requested so that they could be assigned to reviewers knowledgeable about that area.

Selection of Reviewers. Reviewers were solicited from the field and included a broad array of professionals in both general and special education with varying levels of knowledge about assistive technology (e.g., some reviewers were well versed in specific types of AT such as hearing, mobility, or communication, while others were generalists who knew a great deal about a range of AT devices). Approximately 95 reviewers participated in the review process for each round of applications.

Reviewer Training. Each reviewer participated in a training session that provided an overview of the ATIP application process and an introduction to the various forms and tools that would be used in reviewing each application. As part of the training, each reviewer scored a mock application (Appendix G, PDF). The resulting scores were subsequently analyzed by an independent researcher from Ohio State University to calibrate each reviewer's scoring. This technique provided a mechanism to control for reader bias and to statistically adjust review scores in cases where scores were inconsistent.
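The report does not state which statistical method the Ohio State University researcher used for calibration. As a hedged sketch only, the example below shows one common approach under that assumption: treating each reviewer's deviation from the group mean on the shared mock application as a personal bias and removing it from that reviewer's live scores. The reviewer names and numbers are invented.

```python
# Hedged sketch of one possible reviewer-calibration scheme (an assumption,
# not the method actually used in ATIP): each reviewer's deviation on the
# shared mock application is treated as a personal bias and removed from
# that reviewer's live scores.

mock_scores = {"reviewer_a": 78, "reviewer_b": 90, "reviewer_c": 84}  # hypothetical
group_mean = sum(mock_scores.values()) / len(mock_scores)             # 84.0

def calibrated(reviewer: str, raw_score: float) -> float:
    """Subtract the reviewer's bias (mock score minus group mean) from a live score."""
    bias = mock_scores[reviewer] - group_mean
    return raw_score - bias

# A generous reviewer's live score is pulled back toward the group norm.
print(calibrated("reviewer_b", 88))  # 82.0
```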

Review Activities. The review process was designed so that each grant was read by three people with varied backgrounds, at least one of whom had expertise in the primary disability or area of need identified by the applicant (e.g., a speech therapist would review grants related to a communication need, and an audiologist would review grant requests for classroom amplification). With a large, diverse pool of reviewers, the process was organized to minimize the likelihood that the same three people would repeatedly read the same proposals, thereby increasing the validity of the scores.

Reviewers evaluated each proposal using a scoring rubric that depended on the level of funding requested: Level 1 ($3,000 and under) [Level 1 Scoring Rubric (Appendix H)] or Level 2 (over $3,000) [Level 2 Scoring Rubric (Appendix I)]. Responses were recorded on a computerized answer sheet (bubble sheet), with each rubric item corresponding to a section of the application. Reviewers also wrote free-form comments.

For grants requesting over $3,000, a consensus process was used. A facilitator with expertise in the student's primary disability area was assigned to read the grant along with two randomly assigned readers. After the application was read and scored, the facilitator led a discussion with the other grant readers to ensure agreement on the application's scoring and on the consensus comments that would later be provided to the applicants.

Cut-off Scores and Rankings. At the conclusion of the state-wide grant application review, the computerized scoring sheets for each proposal were submitted to a researcher at Ohio State University for statistical analysis. Several factors were considered at this point. Some items on the scoring rubric were weighted more heavily than others to reflect the importance of those items. In addition, districts were awarded points based on their level of financial need. The cutoff point for the awarding of the grants was determined through a statistical analysis to ensure the reliability of the scores, as well as equity in the distribution of the awards.
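The weights, need points, and cutoff values used by ATIP are not reported here, so the sketch below is only a minimal illustration of the general pattern described above: a weighted sum of rubric item scores plus district financial-need points, compared against a cutoff. Every value and name in it is an invented assumption.

```python
# Minimal sketch of the general scoring pattern described above. The weights,
# need points, and cutoff value are invented for illustration; the actual
# ATIP weighting and statistical cutoff procedure are not given in this report.

rubric_weights = {"critical_need": 3, "goal_quality": 2, "trial_evidence": 2, "evaluation_plan": 1}

def total_score(item_scores: dict[str, int], need_points: int) -> int:
    """Weighted sum of rubric item scores plus district financial-need points."""
    weighted = sum(rubric_weights[item] * score for item, score in item_scores.items())
    return weighted + need_points

application = {"critical_need": 4, "goal_quality": 3, "trial_evidence": 4, "evaluation_plan": 2}
score = total_score(application, need_points=5)   # 3*4 + 2*3 + 2*4 + 1*2 + 5 = 33
CUTOFF = 30                                       # hypothetical cutoff
print(score, "fund" if score >= CUTOFF else "deny")
```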

Notification

Notification letters were sent to three individuals associated with each proposal: a building contact, a district contact, and the superintendent. There were two types of letters: an intent-to-fund letter and a denial letter.

Districts whose applications were not funded were sent a denial letter that included the consensus summary written by the review team regarding the application's strengths and weaknesses. These applicants were encouraged to follow up with their local Special Education Regional Resource Center or Educational Service Center for technical assistance in re-submitting their applications in a later round. Applications were denied for a variety of reasons, but two key factors were the absence of a trial period with the assistive technology and the failure to include the student and/or the student's parent(s) in the assessment process.

The intent-to-fund letter was prepared for grant requests that were approved. Applicants were required to (a) provide a copy of the IEP to the ATIP project, (b) complete an online Student Performance Profile (SPP pre-intervention), (c) complete an online Assistive Technology District Profile, and (d) submit a signed copy of a financial assurances document (Appendix J). The IEP copy was required so that teams could demonstrate that they had recorded AT in the IEP document in accordance with the state-mandated Model Policies and Procedures for the Education of Children with Disabilities.

Student Performance Profile

For each funded grant, the school team was required to complete the Student Performance Profile-Pre (SPP-Pre) (Appendix K, PDF) as a pre-intervention measure before using devices obtained with ATIP funding, and to complete a follow-up measure (SPP-Post) (Appendix L, PDF) eight months to one year later.

The Student Performance Profile provides a way to begin analyzing student outcomes resulting from the provision of assistive technology in the schools. In the absence of outcome measurement instruments tailored to widespread use in the school setting, the Student Performance Profile was designed to begin examining important factors that impact educational progress with AT. All funded technology must directly support the student's Individualized Education Plan goals and the rate of progress on these goals must be documented on the Student Performance Profile. The impact of AT on access to and progress in general education is also assessed. Assistive technology outcomes researchers have highlighted the importance of teasing out the impact of assistive technology from the many other interventions that the student is receiving in the school setting.

The SPP identifies 10 factors that may contribute to goal achievement for students: (a) natural development; (b) compensation for impairment by the student; (c) adaptations of specific curricular tasks; (d) redesign of the instructional environment; (e) changed performance expectations; (f) participation in general education instruction; (g) related and support services; (h) personal assistance; (i) assistive technology devices; and (j) assistive technology services. The Student Performance Profile attempts to identify the unique contribution of AT by asking teams to rate, on a scale of zero to ten, the contribution of each intervention toward the student's progress in the area of need supported by the assistive technology. By rating the contribution of the different interventions, teams isolate the unique contribution that AT makes to the student's progress.
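As a hedged illustration of how such ratings might be summarized, the sketch below takes hypothetical zero-to-ten ratings for the ten factors and reports the share of the total contribution attributed to assistive technology devices and services. The relative-share calculation is an assumption introduced here, not a scoring rule defined by the Student Performance Profile.

```python
# Hypothetical SPP-style ratings (0-10) for the ten contributing factors;
# the relative-share summary below is an illustrative assumption, not a
# calculation prescribed by the Student Performance Profile itself.

ratings = {
    "natural_development": 3,
    "student_compensation": 2,
    "curricular_task_adaptations": 4,
    "instructional_environment_redesign": 2,
    "changed_performance_expectations": 1,
    "general_education_participation": 5,
    "related_and_support_services": 4,
    "personal_assistance": 3,
    "assistive_technology_devices": 8,
    "assistive_technology_services": 6,
}

at_factors = {"assistive_technology_devices", "assistive_technology_services"}
at_share = sum(v for k, v in ratings.items() if k in at_factors) / sum(ratings.values())
print(f"AT accounts for {at_share:.0%} of the rated contribution")  # 37%
```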

The Assistive Technology District Profile

Districts that received grant funds were also required to complete two Assistive Technology District Profiles. The Assistive Technology District Profile (Appendix M) was an online data collection instrument designed to measure the impact of ATIP on the delivery of assistive technology devices and services in Ohio's schools. The instrument was divided into two sections, one focusing on services and the other on devices. Each district was required to complete an Assistive Technology District Profile the first time an Assistive Technology Infusion Project (ATIP) grant was awarded. In addition to establishing a baseline of assistive technology services and devices, the Profile provided a summary of the acquisition, implementation, and utilization of assistive technology in Ohio. A follow-up profile was required at the end of the project. The data were collected to help guide the state in future policy and resource decisions concerning assistive technology infrastructure, deployment, and professional development initiatives. In addition, local districts can use the data to support their technology planning and to assess the local implementation of assistive technology.

Results

A table summarizing the number of applications and awards (Appendix N) made in each of the four rounds of funding illustrates the consistency of funding across the four cycles. Analysis of grant award data indicates that the program was successful in achieving its goal (Appendix O) of distributing AT to students with disabilities widely across the state of Ohio, funding applications from more than 525 districts. In the four rounds of funding, 4,979 applications were received and nearly 70% were funded.

A requirement of the ATIP process was that requests for AT devices for an individual student total $100 or more. The average amount for grants awarded at Level 1 ($3,000 and under) was $1,594, and the average for those at Level 2 (over $3,000) was $6,200.

Discussion

ATIP demonstrates the success of a large-scale Internet-based data collection system. Although the use of the Internet to gather information is becoming increasingly common, it remains less frequent in the field of education. Administering the grant over the Internet provided a number of benefits:

Lessons Learned

Implementation of ATIP was a learning process, and the pace of development of the entire project was necessarily rapid. Because the Ohio Assistive Technology Infusion Project was interested in measuring outcomes, project staff seized the opportunity to team up with the ATOMS Project staff to collaborate on an outcome measurement system. The project timelines (Appendix B) for notification of funding for both ATIP and the ATOMS Project demanded rapid development of the outcome measurement component. Had more time been available, it would have been helpful to pilot the data collection instruments on a larger scale than was possible prior to widespread use.

One small but important lesson learned was the value of providing applicants with a list from which to select their responses. For example, when participants filling in the application made spelling errors, used inaccurate names for devices, or were inconsistent in their use of terminology (e.g., EC vs. Early Childhood), subsequent searching and report generation became problematic. Examining each field to determine whether selection from a menu is an option may reduce later problems with data summary and analysis.
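To make this lesson concrete, the sketch below contrasts inconsistent free-text entries (using the EC vs. Early Childhood example above) with values normalized against a controlled list. The category list and synonym table are illustrative assumptions, not ATIP's actual vocabulary.

```python
# Illustrative sketch of the lesson above: free-text entries drift into
# variants that break searching and reporting, while a controlled list keeps
# the data consistent. The categories and synonym table are assumptions.

ALLOWED_PROGRAM_TYPES = ["Early Childhood", "Elementary", "Middle School", "High School"]

SYNONYMS = {"ec": "Early Childhood", "early childhood": "Early Childhood",
            "elem": "Elementary", "elementary": "Elementary"}

def normalize(free_text_entry: str) -> str:
    """Map a free-text entry onto the controlled vocabulary, if possible."""
    return SYNONYMS.get(free_text_entry.strip().lower(), free_text_entry)

raw_entries = ["EC", "Early Childhood", "elem"]      # inconsistent free text
print([normalize(e) for e in raw_entries])           # all mapped to list values
# A select menu bound to ALLOWED_PROGRAM_TYPES avoids the cleanup entirely.
```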

Because the people who provide direct training and implementation in the use of AT appear to play a pivotal role in whether a positive outcome is achieved, gathering additional information about these key implementers might have been informative. The person completing the application was required to respond to the question "As you initiate this project, what expectations do you have about how assistive technology might help this targeted student?" However, whether there was team consensus on this issue is not clear. Additionally, the degree of training in assistive technology that the student's team had received is not readily available. This additional information would be helpful in examining the outcomes of AT use.

Applicants were required to identify measurable goals related to the requested technology and to develop a related evaluation plan. The evaluation plan was to include information about the techniques used and the frequency for collecting data to evaluate student progress. This is an area in which teams require ongoing training, and the field may need improved instrumentation.

Concluding Notes

The purpose of this report was to provide detailed information about the ATIP process and to archive the instruments and tools generated by the project for states, schools, and researchers who may be interested in replicating the procedures associated with large-scale dissemination and evaluation of AT in the schools. In subsequent reports we will describe research on the outcomes of ATIP.

Appendices

Find out more about related work on Isolating the Impact of Interventions (I3) Instrumentation