Oakland Unified School District
The evaluation of the Urban Dreams project is a cooperative undertaking by district personnel, partner agencies, and an external evaluation group. Together, staff members from these agencies have been involved in most of the evaluation activities, including the development of annual local evaluation plans and the project's discrepancy evaluation model.
The Oakland Unified School District contracted with the Center for Evaluation and Research LLC (C.E.R.) from the outset of the project to facilitate the evaluation of the Urban Dreams project. C.E.R. coordinated local evaluation efforts to furnish process and summative information to the project staff, with the goal of validating successful practices and supporting informed decision-making. Three C.E.R. evaluators currently work on the project, along with a four-person support staff. The three evaluators are Matthew Russell, Ed.D., Carla Piper, Ed.D., and Rachelle Hackett, Ph.D.
The local evaluation plan (attached) consists of an outline of general evaluation activities to be accomplished each year. The plan serves as a general guide that directs evaluation efforts. The following is an overview of the major evaluation activities in each area:
1. Student Academic Achievement and Technology Proficiency: Student achievement is the principal goal of Urban Dreams. The evaluation tracks and analyzes students participating in Urban Dreams classrooms at all of the participating high schools. Evaluation activities this past academic year included: a) development of representative samples of Urban Dreams students and non-participating students using random selection techniques, b) collection of standardized test and state standards information, and c) comparison analyses between groups.
2. Student Technology Proficiency: A major outcome of the project is student technology proficiency, which is supported by the dissemination of technology in classrooms and in homes. Evaluation activities related to student technology proficiency included: a) development, dissemination, and collection of the Student Technology Proficiency Inventory (STPI) from a representative sample of students in Urban Dreams classrooms and non-project students, b) comparison analyses of technology proficiency between groups, and c) development and submission of a proposal outlining the survey results.
3. Staff Development: Urban Dreams is providing ongoing professional development for over 120 teachers in the areas of technology, language arts, and history. Evaluation activities included: a) surveys and interviews with teachers, b) creation of a lesson plan rubric that guided lesson plan development by project teachers, c) analysis of curricula and instructional materials developed by the participating teachers, d) creation of preliminary video case studies, and e) analysis of project-sponsored workshops.
4. Community Involvement and Technology Access: The project is providing technology directly to classrooms and homes. Evaluation activities included: a) structured interviews with project staff and collaborating partners providing these services, b) analysis of community-based technology trainings, and c) follow-up telephone interviews with families who have received refurbished computers.
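The sampling and comparison procedure described in item 1 can be sketched in code. This is a minimal illustration under stated assumptions, not the evaluators' actual procedure: the student IDs, scores, and function names are hypothetical, and Welch's t statistic stands in for whatever "appropriate statistical analysis" the evaluation actually employed.

```python
import random
import statistics

def draw_sample(student_ids, n, seed=0):
    """Draw a simple random sample of student IDs (a random selection
    technique comparable to the one described in the evaluation plan)."""
    rng = random.Random(seed)  # fixed seed so the sample is reproducible
    return rng.sample(student_ids, n)

def welch_t(group_a, group_b):
    """Welch's t statistic for a between-group comparison of test scores."""
    ma, mb = statistics.mean(group_a), statistics.mean(group_b)
    va, vb = statistics.variance(group_a), statistics.variance(group_b)
    se = (va / len(group_a) + vb / len(group_b)) ** 0.5
    return (ma - mb) / se

# Hypothetical scaled scores, for illustration only
project = [652, 648, 661, 655, 649, 658]
comparison = [640, 645, 638, 642, 647, 641]
t = welch_t(project, comparison)
```

A positive t here would simply indicate a higher mean in the project sample; the actual evaluation would also report degrees of freedom and significance.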
The project evaluation team developed the following logic model, which provides a graphic representation of the relationships between program activities and proposed outcomes:
Figure 1. Urban Dreams’ evaluation logic model
Evaluators and project staff developed a discrepancy evaluation model (DEM) that outlines in detail each of the evaluation activities. The DEM includes program narratives, component maps, and input, process, and output (IPO) statements for each program component at two increasingly detailed levels. The plan is currently available on the evaluation website maintained by C.E.R. (http://californiaschools.net/ud). The plan is periodically updated to reflect not only the original resources, goals (outputs), and processes, but also discrepancies in intended outputs (which can be positive or negative in relation to the original intents). The DEM serves as an ongoing narrative of the project's activities relative to its original goals and objectives, as well as a roadmap that evaluators use to analyze the status of the process and summative evaluations.
Instruments: Appropriateness, Reliability and Validity
The project evaluation utilized a variety of instruments to gather information on program processes and impacts. Data collection methods included surveys, workshop evaluation forms, telephone questionnaires, observation protocols, one-to-one interviews, and focus groups. Evaluators, project staff, and teachers collected the data.
During this past year (2002-2003), the evaluators conducted a follow-up survey and held interviews with teachers from the first two cohorts. The evaluators made direct contact with teachers and provided stipends to enhance participation. The goal of this data collection effort was to determine how teachers were integrating their new resources within their instructional programs. The measures used in the evaluation of professional development are:
1. video case studies of how teachers are integrating technology into their content areas;
2. teacher and parent participation records and workshop evaluation forms; and
3. observations, interviews and focus groups with project staff, teachers, parents and collaborating agency personnel.
Another major focus of the evaluation has been the collection and analysis of student data. Over a ten-month period, the evaluators worked with the district's technical staff to obtain access to district demographic and test data. The district grants office provided substantial assistance to the evaluators in obtaining student achievement data.
The two principal student data elements collected this year were:
1. Stanford Achievement Test (SAT/9) scores (part of California's mandatory state standardized testing system, including reading, language arts, and social studies sub-tests) and the new STAR proficiency levels. Activities included sampling of project and non-project students and subsequent between-group comparisons using appropriate statistical analyses; and
2. development, administration and analysis of the Student Technology Proficiency Inventory.
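One way the STAR proficiency comparison described above could be carried out is a two-proportion z test on the share of students scoring at or above a proficiency cut point in each group. The sketch below is illustrative only; the counts are hypothetical and the report does not specify which statistic the evaluators actually used.

```python
import math

def proportion_z(successes_a, n_a, successes_b, n_b):
    """Two-proportion z statistic, e.g. comparing the share of project
    vs. non-project students at or above a STAR proficiency level."""
    pa, pb = successes_a / n_a, successes_b / n_b
    # pooled proportion under the null hypothesis of no group difference
    p = (successes_a + successes_b) / (n_a + n_b)
    se = math.sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))
    return (pa - pb) / se

# Hypothetical counts, for illustration only:
# 54 of 120 project students and 41 of 118 non-project students proficient
z = proportion_z(54, 120, 41, 118)
```

A z statistic near or beyond ±1.96 would indicate a difference in proficiency rates significant at the conventional .05 level.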
Community involvement and technology access were measured through a combination of qualitative and quantitative methods, including: a) a telephone survey of parents who received a computer through the project's Take-Home Computer program; b) analysis of community technology trainings; and c) one-to-one interviews with administrators of collaborating agencies.
The project evaluators also undertook the analysis of a school site
survey that was administered in accordance with state guidelines to determine
the use of technology in the target schools.
No Child Left Behind
The evaluation has taken seriously the changes in federal legislation, particularly with regard to the use of more rigorous "scientifically based research" methods. In response, the project adopted a quasi-experimental approach to analyze student academic achievement and technology proficiency. This approach meets the definition of scientifically based research as defined in Title IX of the reauthorized Elementary and Secondary Education Act. Specifically, the evaluation has met the following five criteria and is awaiting approval on the sixth criterion:
systematic, empirical methods that draw on observation and experiment;
data analyses that are adequate to test the stated questions and provide a justification for the general conclusions drawn;
measurements that are reliable and valid;
quasi-experimental design with appropriate controls;
reporting sufficiently detailed to be replicated.
Findings from the student technology survey have been submitted as a proposal to the American Educational Research Association.
The evaluation for the 2002-2003 academic year will include a similar quasi-experimental approach for the project's professional development component.
The project stakeholders met regularly with evaluators to plan and discuss evaluation findings. Evaluators met, on average, one day each week with project, school, or collaborating agency staff. The evaluators averaged eight other communications (via e-mail, mail, telephone, and fax) with project staff, teachers, parents, and agency staff each week. The evaluators also maintained a comprehensive web presence with monthly updates at http://californiaschools.net/ud. The web site includes all of the process reports, summative report narratives, evaluation plans, and the new video case studies.
© Copyright 2002 Center for Evaluation and Research, LLC