DEVELOPMENT OF AN ASSESSMENT PLAN
I.
Introduction
A.
“Nine Principles of Good Practice for Assessing Student Learning” (Astin, A., Banta, T., Cross, P., El-Khawas, E., Ewell, P., Hutchings, P., Marchese, T., McClenney, K., Mentkowski, M., Miller, M., Moran, E., and Wright, B., 2003, American Association for Higher Education Assessment Forum, AAHE)
1.
The assessment of student learning begins with
educational values.
a.
assessment is a vehicle
for educational improvement.
b.
assessment is driven
by what we most value for students to learn and gain from their experience with
us.
2.
Assessment is most effective when it reflects an
understanding of learning as multidimensional, integrated, and revealed in
performance over time.
a.
learning entails what students know and what they
can do with what they know
b.
use of diverse methods for assessment
c.
use of measurements over time to reveal change and
growth
3.
Assessment works best when the programs it seeks to
improve have clear, explicitly stated purposes.
a.
assessment is a
goal-oriented process.
b.
clear, shared, implementable goals are the cornerstone of assessment that is focused and useful
4.
Assessment requires attention to outcomes but also
and equally to the experiences that lead to those outcomes.
a.
we need to know where students end up, but also how
they develop along the way
b.
assessment helps us see how students learn best
5.
Assessment works best when it is ongoing, not
episodic.
a.
assessment is a process whose power is cumulative
b.
monitor progress toward intended goals in a spirit
of continuous improvement
6.
Assessment fosters wider improvement when
representatives from across the educational community are involved.
7.
Assessment makes a difference when it begins with
issues of use and illuminates questions that people really care about.
a.
assessment produces evidence that is relevant and that people will find credible, suggestive, and applicable to the decisions that need to be made.
b.
it is a
process that starts with the questions of decision-makers, that involves them
in the gathering and interpreting of data, and that informs and helps guide
continuous improvement.
8.
Assessment is most likely to lead to improvement
when it is part of a larger set of conditions that promote change.
a.
when the campus values continuous improvement,
assessment results will be sought out
9.
Through assessment, educators meet responsibilities
to students and to the public.
a.
we have an obligation to the public to improve
b.
the public has an obligation to support our efforts
to improve
B.
Elements of
Good Assessment
1.
Asks important questions
2.
Reflects the departmental/institutional mission
3.
Reflects identified learning outcomes
4.
Is linked to a plan for decision-making
5.
Encourages involvement
6.
Contains relevant assessment techniques
7.
Is shareable and leads to reflection
C.
Assessment’s Relation to a Program’s Outcomes (Bresciani, M.J., NetResults)
1.
What are we trying to do and why are we doing it?
2.
What do we want the student to learn or know as a
result of our program or interaction with our department?
3.
How well are we doing?
4.
How do we know?
5.
How do we use this information to improve?
6.
Does it work?
D.
Potential Benefits of Assessment (Bresciani, M.J., NetResults)
1.
reinforce or emphasize the mission of the unit or
department
2.
modify, shape, and improve programs and/or
performance (formative assessment)
3.
critique a program’s quality or value compared to
the program’s previously defined principles (summative assessment)
4.
inform planning
5.
inform decision-making
6.
evaluate programs, not personnel
7.
support the request for additional funds from the
university and external community, and
8.
assist in meeting
accreditation requirements, models of best practices, and national benchmarks.
E.
Beginning
the Discussion
1.
What role do you think assessment has in our
organization?
2.
Describe what you think assessment should
accomplish in our organization.
3.
What role do you think you would like to have in
providing assessment in our organization?
4.
Do you have any concerns or questions about the
assessment process that should be addressed as we begin to develop a strategic
plan for assessment?
5.
What support structures do you feel you would need
to actively participate in conducting assessment projects?
II.
Developing
a Strategic Plan for Assessment
A.
Assessment
begins with knowing what you want
B.
Define your
assessment vision: what do you hope to accomplish?
1.
Where are we going?
2.
What have we accomplished so far?
3.
Define the vision in a short statement that
inspires and motivates others
4. Have a wide-ranging discussion that includes how this assessment vision fits into the larger vision of Student Affairs and the University
C.
Consult
with the Institutional Review Board (IRB) of the University
1. Dr. Sandra Holmes of the Psychology Department is chair of the IRB
2. She will provide guidelines for ensuring protection for participants in research
D.
Review the
type of resources that exist
E.
Decide on a guiding model, such as the Context, Input, Process, and Product model (the CIPP Model of Daniel Stufflebeam) of planning and evaluation (From: Stufflebeam, D., 2001, Evaluation Models; and Rodgers, R., 1979, “A Student Affairs Application of the CIPP Evaluation Model,” in Kuh, G., Evaluation in Student Affairs)
1.
Examples of other evaluation models
a.
objectives based studies
b.
objective testing programs
c.
outcome evaluation as value-added assessment
d.
performance testing
e.
experimental studies
f.
management information systems
g.
benefit-cost analysis approach
h.
case study evaluations
i.
accreditation/certification approach
F.
Operationalize your plan: assign
responsibilities and roles
III.
Initiating
the Process (Upcraft
and Schuh, 1996, Assessment in Student Affairs)
A.
Do not do a
study that no one wants
B.
Determine
confidentiality
C.
Determine
who should be involved
D.
Conduct a
sound study
E.
Understand
what kinds of information you will need to provide in your report.
IV.
Defining
Your Assessment Vision
A.
Determine
who will be the audience for the results
B.
Determine
the appropriate format(s) of the study
C.
Involve the
key stakeholders in the planning of the study and allow them to review the
instrument drafts
D.
Example of an assessment vision
V.
Identifying
Resources
A.
Is money
available for your assessment needs?
B.
Who is
available to analyze your data?
C.
Do you have
the cooperation of department and other key stakeholders?
D.
Do you have
people to assist with data collection?
E.
Can you
offer incentives?
F.
Is your
organization supportive of your efforts?
VI.
Developing
an Assessment Model Tied To Your Planning Model
A.
Describe
your departmental mission, vision, priorities, and values
1.
Mission
a.
needs to be concise
b.
what does the department stand for?
c.
what does the department do for our students?
d.
what impact do you have on your students vs. what
programs do you offer
2.
Vision (Example: “Changing Lives at UWSP”)
a.
compelling
b.
inspirational
c.
calls people to change
d.
offers a “sound bite”
3.
Values
a. what
does your department value?
4.
Priorities
a.
based on your department’s vision, mission, and
values, what are your priorities?
b.
Are these priorities connected to budget decisions?
5.
Goals: a general description of the ultimate intended benefits, results, or desired outcomes of the program (what core issues and values does the program address?)
6.
Objectives: measurable statements about specific
intended outcomes that a particular program
activity or service is expected to accomplish in a given time period.
a.
has a target group (who)
b.
has what is to be done (program)
c.
has a time frame (when)
d.
has a target performance (how much)
e.
has a measurement (how it will be measured)
7.
Objectives that are useful should:
a.
tell who
b.
is going to be doing what
c.
when
d.
how much, and
e.
how we will measure it
8.
Checklist for Evaluating Written Objectives
a.
uses action verbs that specify definite, observable
behaviors
b.
uses simple language
c.
describes student rather than staff behavior
d.
describes a learning outcome rather than a learning
process
e.
indicates a single outcome per objective
f.
can be assessed by one or more indicators (methods)
g.
is clearly linked to a goal
h.
is realistic and attainable
i.
does not oversimplify when complexity is needed
j.
is clear to people outside the department
k.
is validated by departmental colleagues
B.
Example of a Goal and Objective (Anti-Smoking Program) (From: Grayson, T., “Constructing Logic Models,” 2000)
1.
Goal: The life-expectancy of all Americans will
increase to 76 years of age by the year 2020 without creating any economic
downturns in the Nation’s economy.
2.
Objective: To reduce the number of all teenage
smokers by implementing a Nationally funded anti-smoking initiative starting in
the year 2000 and continuing until the year 2020, by 98%, as measured by a
stratified random sampling of teenagers each year, beginning in the year 2000.
C.
Example of a Goal and Objective (Get Ready Program) (From: Grayson, T., “Constructing Logic Models,” 2000)
1.
Goal: Ensure that all individuals with disabilities
acquire self-determination skills necessary for gainful employment or
post-secondary schooling after graduation from high school.
2.
Objective: To increase the number of high school
graduates with disabilities securing gainful employment or entering
post-secondary schools within 6 months after graduation by developing and
implementing the Get Ready Program in all secondary schools.
D.
Evolution of a Good Objective (From: Grayson, T., “Constructing Logic Models,” 2000)
1.
Stage 1: To increase the reading skills of at risk
students (customer and expected result)
2.
Stage 2: To increase the reading skills of at risk
students ages 14 to 18 (specific target)
3.
Stage 3: To increase the reading skills of at risk
students, ages 14 to 18, through tutoring (the program)
4.
Stage 4: To increase the reading skills of at risk
students, ages 14 to 18, through tutoring, as measured by the performance on
the school district’s reading comprehension test (means of measuring results)
5.
Stage 5: To increase the reading skills of at risk
students, ages 14 to 18, through tutoring, as measured by performance on the
school district’s reading comprehension test to be administered before and
after the program (when the results are expected)
6.
Stage 6: To increase the reading skills of at risk
students, ages 14 to 18, through tutoring, and as measured by an average
increase of five percent on the school district’s reading comprehension test to
be administered before and after the program (the standard of success)
7.
Stage 7: To increase the reading skills of 25 at
risk students, ages 14 to 18, through tutoring, and as measured by an average
increase of five percent on the school district’s reading comprehension test to
be administered before and after the program (the number of program recipients)
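To make the pieces of the Stage 7 objective concrete, here is a minimal Python sketch that records the who / what / when / how-much / how-measured components described above and checks the five percent standard of success. The field names and the pre/post test scores are hypothetical, added only for illustration; this is not a prescribed procedure from the source.

```python
# Illustrative only: hypothetical pre/post scores for the Stage 7 tutoring
# objective; field names and numbers are assumptions, not from the source.
from dataclasses import dataclass
from statistics import mean

@dataclass
class Objective:
    who: str          # target group
    program: str      # what is to be done
    when: str         # time frame
    target: float     # target performance (standard of success)
    measurement: str  # how it will be measured

stage7 = Objective(
    who="25 at risk students, ages 14 to 18",
    program="tutoring",
    when="pre- and post-program test administrations",
    target=0.05,      # average increase of five percent
    measurement="school district's reading comprehension test",
)

# Hypothetical reading comprehension scores (percent correct).
pre_scores = [62, 55, 70, 48, 66]
post_scores = [68, 60, 72, 55, 67]

# Average relative increase across the tutored students.
gains = [(post - pre) / pre for pre, post in zip(pre_scores, post_scores)]
average_gain = mean(gains)

print(f"Average increase: {average_gain:.1%}")
print("Standard of success met" if average_gain >= stage7.target
      else "Standard of success not met")
```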
E.
Describe
your strategic objectives
1.
Learning outcomes
2.
Service outcomes
3.
Program outcomes
4.
Behavioral indicators
5.
Key performance indicators
F.
Describe
the activity or
program that is designed to impact students
1.
A program is an intentional use of resources that supports specific strategies or activities, which produce defined results that address strategic problems and advance the department’s mission
2.
Activities such as training, outreach, maintenance,
and management are major strategies.
3.
What is to be done to achieve our intended goals
and mission?
4.
Direct products of the program activities could include the number of individuals served or the number of training sessions offered
5.
How much do you do and for how many individuals?
G.
Describe
how strategic objectives will be measured
H.
Describe
how reporting of results and feedback from stakeholders will occur
I.
Objectives
Vs. Outcomes
1.
Objectives: intended results or consequences of
instruction, curricula, programs, or activities
a.
they specify what is expected and describe what
should be assessed
2.
Outcomes: achieved results or consequences of what was learned; evidence that some learning took place
a.
outcomes are behaviors and products generated by
students after the program is delivered and are the object of the assessment.
b.
outcomes should be realistic, achievable, and
directly related to the activities of the program
c.
outcomes can be immediate, short-term, or
long-term.
J.
Types of
Outcomes
1.
Program Outcomes
a.
illustrate what you want your program to do
b.
present measurable and meaningful statements of
what you want the program to accomplish
2.
Service Outcomes
a.
related to program outcomes
b.
Example: Financial Aid: develop a financial aid
package that ensures that students have enough money to enroll and stay
enrolled in college
c.
Example:
3.
Learning Outcomes
a.
illustrate the learning that you want to occur
b.
assess cognitive abilities, such as critical
thinking skills
4.
Developmental Outcomes
a.
assess affective dimensions or attitudes
b.
examples would include being sensitive to the
values of others, becoming aware of one’s own talents and abilities, and
developing an appreciation for life-long learning
K.
Creation of
Learning Outcomes
1.
Learning:
a.
is affected by the educational climate
b.
is an active search for meaning
c.
is developmental, involving the whole person
d.
is done by individuals who are tied to others as
social beings
e.
is enhanced by taking place in the context of a
compelling situation
f.
is fundamentally about making and maintaining
connections
g.
is grounded in particular contexts and individual
experiences
h.
involves the ability of individuals to monitor
their own learning
i.
requires frequent feedback, practice, and
opportunities for utilization
j.
takes place informally and incidentally
L.
Guidelines
for Outcomes
1.
Does each outcome describe what the program or
department intends for students and/or staff to know, think, or do?
2.
Does the intended outcome meet the following
criteria?
a.
detailed and specific?
b.
appropriate to the program or department?
c.
measurable/identifiable?
d.
meaningful in making decisions of how to improve
the program?
3.
Does the program have a component to be able to
deliver/implement each outcome?
4.
Are multiple methods, if appropriate, used to
assess outcomes?
5.
Do the assessment methods include direct and indirect
measures of outcomes?
6.
Is each assessment method or tool appropriate to
the outcome it is evaluating?
M.
NASPA’s Six
Learning Outcome Categories (Based on the Student
Learning Imperative, plus two from Schuh and Upcraft)
1.
Complex cognitive skills: reflective thought,
critical thinking, quantitative reasoning, and intellectual flexibility
2.
Knowledge acquisition: subject matter mastery and
knowledge application
3.
Intrapersonal development: autonomy, values,
identity, aesthetics, self-esteem, and maturity
4.
Interpersonal development: understanding and
appreciating human differences, ability to relate to others, and establishing
intimate relationships
5.
Practical competence: career preparation, managing
one’s personal affairs, and economic self-sufficiency
6.
Civic responsibility: responsibilities as a citizen
in a democratic society and commitment to democratic ideals
7.
Academic achievement: the ability to earn
satisfactory grades in courses
8.
Persistence: the ability to pursue a degree to
graduation or achieve personal educational objectives
VII.
Writing
Outcomes
A.
Examples of Areas for Student Learning Outcomes
1.
Self Assessment
2.
Critical Thinking
3.
Values
4.
Goal Setting
5.
Confrontation
6.
Current Events
7.
Community Service
8.
Time Management
VIII.
Examples of
Cognitive Development Objectives: Use of Bloom’s Taxonomy
A. Benjamin Bloom created a taxonomy that has been
used for writing objectives for lesson plans in educational settings.
B.
That
taxonomy can also be used to write student learning outcomes.
1.
Bloom’s taxonomy defines six levels in which
objectives can be categorized: knowledge, comprehension, application, analysis,
synthesis, and evaluation
2.
Bloom’s taxonomy is used heavily in teacher education
IX.
Examples of Bloom’s Taxonomy (Adapted from Bloom, B.S. (Ed.), 1956, Taxonomy of Educational Objectives: The Classification of Educational Goals, Handbook 1: Cognitive Domain)
Competence / Skills Demonstrated
Knowledge
- observation and recall of information
- knowledge of dates, events, places
- knowledge of major ideas
- mastery of subject matter
- question cues: list, define, tell, describe, identify, show, label, collect, examine, tabulate, quote, name, who, what, when, where, etc.
Comprehension
- understanding information
- grasp meaning
- translate knowledge into new context
- interpret facts, compare, contrast
- order, group, infer causes
- predict consequences
- question cues: summarize, describe, interpret, contrast, predict, associate, distinguish, estimate, differentiate, discuss, extend
Application
- use information
- use methods, concepts, theories in new situations
- solve problems using required skills or knowledge
- question cues: apply, demonstrate, calculate, complete, illustrate, show, solve, examine, modify, relate, change, classify, experiment, discover
Analysis
- seeing patterns
- organization of parts
- recognition of hidden meanings
- identification of components
- question cues: analyze, separate, order, explain, connect, classify, arrange, divide, compare, select, infer
Synthesis
- use old ideas to create new ones
- generalize from given facts
- relate knowledge from several areas
- predict, draw conclusions
- question cues: combine, integrate, modify, rearrange, substitute, plan, create, design, invent, what if?, compose, formulate, prepare, generalize, rewrite
Evaluation
- compare and discriminate between ideas
- assess value of theories, presentations
- make choices based on reasoned argument
- verify value of evidence
- recognize subjectivity
- question cues: assess, decide, rank, grade, test, measure, recommend, convince, select, judge, explain, discriminate, support, conclude, compare, summarize
X.
Examples of
Developmental Objectives Using Chickering’s Vectors (Erwin, T.
D. (1991), Assessing Student Learning and Development, Jossey-Bass)
A.
Chickering’s Seven Vectors
1.
developing a sense of competence
2.
managing emotions
3.
developing autonomy or independence
4.
establishing identity
5.
freeing interpersonal relationships
6.
clarifying vocational and life purposes
7.
developing integrity
B.
Examples of
Applications of Chickering’s
Vectors in Writing Developmental Objectives
1.
Financial Aid: to assist students in planning a
budget and help them to realize the value of money (value formation and
autonomy)
2.
Career Services: to help students study,
experience, and explore various career options; to help students make career
decisions (clarifying vocational and life purposes)
3.
4.
Admissions: to help students develop an
appreciation for diversity by admitting diverse students (freeing interpersonal
relationships and developing integrity)
5.
6.
Campus Activities: to help students develop a sense
of identity through involvement with organizations and attendance at workshops,
lectures, etc. (sense of identity)
XI.
Objectives
For Skill Development
A.
Transferable Skills from a Liberal Arts Education
1.
critical thinking
2.
creative thinking and problem-solving strategies
3.
effective writing
4.
effective oral communication
5.
quantitative analysis
6.
computer literacy
7.
library and information technology competence
8.
values awareness
XII.
Example of Applying Student Development and Learning Outcomes to Career Services (Freeman, J. P., Bresciani, M. J., and Bresciani, D.)
A.
First, let
us start with a sample satisfaction outcome statement: “ ___% of the Career
Service participants will agree or strongly agree that career service programs
provided information and assistance that were helpful in their preparation to
leave the university.”
B.
Next, in
order to assess this satisfaction outcome, you could administer a self-report
satisfaction survey.
C.
You could
also use focus groups or interview students individually in person or do a
telephone survey.
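As one illustration of how the blank in the sample satisfaction outcome above could eventually be filled in, the following minimal Python sketch tallies Likert-scale responses from a self-report survey and reports the percentage of participants who agree or strongly agree. The response labels and data are hypothetical, added only to show the calculation.

```python
# Minimal sketch: tally hypothetical Likert responses to the item
# "Career service programs provided information and assistance that were
# helpful in my preparation to leave the university."
from collections import Counter

responses = [
    "strongly agree", "agree", "neutral", "agree", "disagree",
    "strongly agree", "agree", "agree", "neutral", "strongly agree",
]

counts = Counter(responses)
favorable = counts["agree"] + counts["strongly agree"]
percent_favorable = 100 * favorable / len(responses)

print(f"{percent_favorable:.0f}% of participants agree or strongly agree")
```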
D.
Next, you
can ask these questions
for Career Services:
1.
Will this outcome and assessment method help me
understand what it is that I am doing that is leading to the outcome?
2.
Will this outcome and assessment method help me
understand why I am doing what I am doing?
3.
Will the evidence collected from this method help
me make the decisions I need to make about my program?
E.
Next,
consider taking the sample satisfaction outcome and expanding it with student
development outcomes in mind:
1.
Students will demonstrate appropriate interview
skills during videotaped mock interviews.
2.
Students will articulate a high level of confidence
in their career choice.
3.
Students will document their qualifications for a
position in their resume and performance portfolios.
F.
In these
previous examples, you are assessing student learning and development as an
outcome of what your program is trying to accomplish.
G.
Next, you
would consider the manner of evaluating these outcomes through several possible
assessment methods:
1.
Self-report Survey
2.
Interviews based on criteria
3.
Observations based on criteria
4.
Standardized career service assessment instruments
5.
Student Portfolios
6.
Peer evaluations
7.
Self-evaluation
H.
Next, apply
the following questions to the outcomes:
1.
Which outcome and assessment methods will help me
understand what it is that I am doing that is leading to the outcome?
2.
Which outcome and assessment methods help me
understand why I am doing what I am doing?
3.
Will this kind of evidence help me make the decisions
I need to make?
I. Next, examine the criteria that you have established and the students’ performance to see whether they are demonstrating that they have satisfactorily met those criteria:
1. If the students
are performing adequately in that area, you can feel comfortable with that part
of your program
2. If the
students are not performing as well as you would like in this area, what
adjustments or changes to the program would you make to aid them in being more
successful?
XIII.
The
Measurement Challenge
A.
Collect
information that will enable program improvement and communicate value, as well
as influence new program development
1.
Start with short-term outcomes
2.
Keep an eye on the strategic outcome (solving the long-term problem)
3.
Collect explanatory information on program
implementation, feedback, and external influences.
4.
Examine the relationship between what you did, what
you achieved, and the context
5.
Remember the relationship between short term,
intermediate, and strategic outcomes
XIV.
Measurement
and Program Management
A.
Key
Questions
1.
What long term, strategic problem are we trying to
solve?
2.
What causes the problem?
3.
What part of the problem are we addressing?
4.
Who are we serving?
5.
What do we offer those we are serving?
6.
How are they changed?
7.
What will these changes enable them to do
better/differently?
8.
What do the intermediate outcomes lead to? (our
longer term outcomes, usually the problem to be solved)
9.
What do we do to produce these outcomes?
10.
What resources do we need to implement these
outcomes?
11.
What are the external conditions that could
influence our success?
12.
What evidence will we need to determine what is
working, what is not?
13.
What evidence will I need to tell my program’s
success story to others?
XV.
Measuring
Outcomes
A.
Examples
1.
Living/learning opportunities for student learning
a.
goal: provide living/learning opportunities that
promote student learning
b.
objective: living in housing will be conducive to
residents’ academic success
c.
outcome measures: (1) First Year Experience
participants will be retained at a higher rate than the retention rate of all
freshmen; (2) the average GPA for First Year Experience participants will
exceed the average GPA for all new freshmen.
2.
Living/learning opportunities that encourage
personal growth and community development
a.
goal: provide living/learning opportunities that
encourage personal growth and community development
b.
objective: living in housing will encourage personal
growth and community development
c.
outcome measures: seventy percent of respondents to
the University Educational Benchmarking Survey will be satisfied with their
fellow residents’ attitudes on diversity
3.
Safety for Residents
a.
goal: strengthen the campus community by providing
safe on-campus living environments
b.
objective: residents will feel safe in the halls
and security breaches will be rare
c.
outcome measure: according to the Police and Fire
Department activity logs, there will be no loss of life or serious injury resulting from a breach in residence hall security.
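The living/learning example above (XV.A.1) compares First Year Experience participants with all new freshmen on retention and grade point average. Below is a minimal Python sketch of how those two outcome measures might be computed; the student records and field names are hypothetical, used only to illustrate the comparison.

```python
# Hypothetical student records; field names and values are assumptions used
# only to illustrate the two outcome measures in example 1 above.
from statistics import mean

fye_participants = [
    {"gpa": 3.1, "retained": True},
    {"gpa": 2.8, "retained": True},
    {"gpa": 3.4, "retained": False},
]
all_freshmen = [
    {"gpa": 2.9, "retained": True},
    {"gpa": 2.5, "retained": False},
    {"gpa": 3.0, "retained": True},
    {"gpa": 2.7, "retained": False},
]

def retention_rate(students):
    return sum(s["retained"] for s in students) / len(students)

def average_gpa(students):
    return mean(s["gpa"] for s in students)

print(f"FYE retention {retention_rate(fye_participants):.0%} "
      f"vs. all freshmen {retention_rate(all_freshmen):.0%}")
print(f"FYE mean GPA {average_gpa(fye_participants):.2f} "
      f"vs. all freshmen {average_gpa(all_freshmen):.2f}")
```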
XVI.
Operationalization of the Assessment Plan
A.
Assign
specific tasks and roles
B.
Decide on
formal and informal reporting plan
C.
Identify a
dissemination plan
D.
Use your
experience to revisit your overall strategic plan for assessment
E.
Process
Questions
1.
Do people feel they have the training and knowledge
to carry out the assessment?
2.
Do people feel they have the support to move
forward with the assessment plan?
3.
Is our assessment plan being used to continuously
improve our programs and services?
4.
Are we viewing the assessment plan as a process, rather than a document set in stone?
XVII.
Implementation
Considerations
A.
Sampling
(See “Conducting and Administering Surveys”)
1. Probability Sampling
a.
Stratified sampling: represents the survey
population based on specific variables, is random, and is representative of the
larger population
b.
Random sampling
2.
Non-probability sampling
a.
quota sampling
b.
purposive sampling
c.
convenience sampling
3.
Size of the sample (Sommer,
B. and Sommer, R., 1997, A Practical Guide
to Behavioral Research)
a.
size of the population: generally speaking, the larger the population, the larger the sample (e.g., a sample of the freshman class vs. a sample of student staff in Campus Activities)
b.
available resources and time constraints:
researcher’s time availability, budgetary resources, etc.
c.
strength of the effect: smaller samples are good
for stronger or more straightforward effects
d.
number of statistical analyses to be performed:
multiple statistical comparisons require larger samples (example: need a larger
sample size if the researcher is subdividing the sample into social class, age,
ethnicity, and gender)
e.
refusal and spoilage rates: use a larger sample size if the researcher anticipates that some data may not be usable or that the return rate will be low (example: a group of student employees surveyed at their work site may have an 80% return rate, whereas a mailing to an unselected group of individuals may have a return rate as low as 10%).
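A minimal Python sketch of proportional stratified random sampling follows; the sampling frame, strata (class years), and sample size are hypothetical assumptions, not drawn from the text.

```python
# Proportional stratified random sampling from a hypothetical sampling frame.
# Each stratum (class year) is sampled at random in proportion to its size.
import random
from collections import defaultdict

def stratified_sample(frame, stratum_of, n, seed=42):
    """Draw roughly n units, allocated in proportion to each stratum's size."""
    rng = random.Random(seed)
    strata = defaultdict(list)
    for unit in frame:
        strata[stratum_of(unit)].append(unit)
    sample = []
    for members in strata.values():
        k = round(n * len(members) / len(frame))
        sample.extend(rng.sample(members, min(k, len(members))))
    return sample

# Hypothetical frame: (student_id, class_year) pairs, with more freshmen
# than seniors, as is common on many campuses.
frame = ([(f"F{i}", "freshman") for i in range(400)] +
         [(f"S{i}", "sophomore") for i in range(300)] +
         [(f"J{i}", "junior") for i in range(200)] +
         [(f"R{i}", "senior") for i in range(100)])

sample = stratified_sample(frame, stratum_of=lambda s: s[1], n=50)
print(len(sample), "students sampled")
```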
B.
Use of Multiple
Methods of Collecting Information and
Data (Sommer, B. and Sommer, R. 1997, A Practical Guide to Behavioral Research
and Bresciani, M., Zelna,
C., and Anderson, J., Assessing Student Learning and Development: A Handbook
for Practitioners, 2004, NASPA)
1.
Surveys (See “Conducting and Administering
Surveys”)
2.
Focus Groups
3.
Use of tracking and counts on program participants
and users of services
4.
Observations (casual observations, systematic observations, etc.)
5.
Individual
Interviews (unstructured interviews, structured interviews, semi-structured
interviews, telephone interviews, etc.)
6.
Standardized Instruments
7.
Personal Documents: (research diaries, activity
logs, personal diaries and journals, life histories, such as autobiographies,
biographies, etc.)
8.
Case Studies
9.
Portfolios
10. Simulations (environmental simulations, games,
role-playing, etc.)
11. Benchmarking Examples
a.
National Survey of Student Engagement (Center for Postsecondary Research and Planning)
b.
College and University Counseling Center Directors
Data Bank (The Benchmarking Exchange,
2003)
c.
The Association of College and University Housing
Officers-International Benchmarking Project (Educational Benchmarking, Inc.,
2003)
d.
National Association of Colleges and Employers
Career Services surveys (1998)
e.
The National Survey of
f.
National Association of College and University
Business Officers (NACUBO) Benchmarking Project: covers areas such as
admissions, bookstores, financial aid, registration and records, etc.
g.
National Association of College and University Food
Services (NACUFS) Benchmarking Project
C.
Use of
Statistics for Analysis (See: “A Brief Comparison of Quantitative and
Qualitative Methods”)
1. Descriptive statistics: mean, mode, median, standard deviations, etc.
2. Inferential statistics: correlations, analysis of variance, chi-square tests, etc.
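A minimal Python sketch of both kinds of analysis, using only the standard library and hypothetical assessment data (the variables and values are assumptions; statistics.correlation requires Python 3.10 or later, and tests such as analysis of variance or chi-square would need an additional package such as scipy):

```python
# Descriptive and inferential statistics on hypothetical assessment data.
from statistics import mean, median, mode, stdev, correlation  # Python 3.10+

# Hypothetical satisfaction ratings (1-5) and hours of program participation.
ratings = [4, 5, 3, 4, 2, 5, 4, 3, 4, 5]
hours   = [6, 9, 4, 7, 2, 10, 6, 3, 5, 8]

# Descriptive statistics summarize the sample itself.
print("mean:", mean(ratings), "median:", median(ratings),
      "mode:", mode(ratings), "std dev:", round(stdev(ratings), 2))

# A simple inferential statistic: the Pearson correlation between
# participation and satisfaction.
print("correlation:", round(correlation(hours, ratings), 2))
```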
XVIII. Results and Implications
A.
Report
Writing
1.
vehicle for communication of assessment data
2.
can be formal or informal
3.
should accomplish the following:
a.
link data to decision making
b.
connect data to continual redefinition of strategic
goals
c.
connect data to continual redefinition of learning
outcomes
d.
provide means to improve our impact on students
4.
Questions for the report writing
a.
will readers already be familiar with the study, or
will you need to start from scratch?
b.
Do readers have the time to review an extensive
report or will they want a short summary?
c.
Will readers want only your findings, or will they
want to know how you arrived at them?
d.
Are readers knowledgeable about research methods,
or will you need to explain them?
e.
Are readers likely to be friendly or unfriendly
toward the results? Can you anticipate criticisms?
f.
Are readers likely to be questioned about the study
by others?
B.
Informal
Report
1.
Can be distributed more widely than the formal report
2.
Can serve to start discussions throughout campus
3.
Suggested format:
a.
title
b.
summary of project
c.
response rate and brief description of the sample
d.
4 or 5 questions and responses (use graphical
representation when possible)
e.
contact information available for people who want
to follow up
C.
Formal
Report
1.
Front Cover
a.
title of study
b.
name of individuals involved
c.
date
2.
Summary
a.
what was assessed?
b.
when did assessment project occur?
c.
why was the project conducted?
d.
highlights of major findings
3.
Background Summary
a.
relevant literature, when appropriate
b.
goals/history of program
c.
students,
faculty, and staff involved
4.
Description of the Study
a.
design of the study
b.
data collection process (including sampling
technique)
5.
Results
a.
description of the program
b.
summarize demographic data of respondents
c.
summary of statistical results and qualitative
themes
6.
Discussion of Results
a.
what did you learn?
b.
limitations of the study
c.
recommendations for program studied
d.
recommendations for future research