Saturday, March 12, 2011

Logic Model - Band 2011

I think I did it! I think the following link points to my logic model.  I hope it works.

http://www.scribd.com/doc/50616270/Logic-model-Band-2011

Assignment #3

Programme Evaluation Worksheet
Engage Stakeholders
1. Who should be involved?
The teacher or principal would need to deliver the questionnaires and conduct the interviews.  (As I am the teacher, I would prefer that someone other than myself administer the questionnaire and conduct the interviews, so that the results are as objective as possible; however, this is my assignment.)  Administrators (including the Division), teachers/staff, custodians, students, and parents would be asked to complete the questionnaire.  I would organize the information and prepare it for presentation to the teachers/staff, administration, students, and parents.  (I would also like to ask community members who are affected by our band programme to be part of the study, but at this point time will not allow it.)
2. How might they be (or how are they already) engaged?
The School Division is already involved: it holds a Band information meeting, runs a Beginner Band Clinic, hires teachers, maintains and rents instruments, and purchases music.  Administrators would continue to be involved by supporting band through timetabling, discipline, and concert attendance.  Teachers/EAs will support the programme by scheduling subjects around it, providing a place to keep instruments in class, allowing students to go to band, attending concerts, festivals, etc., and demonstrating a positive attitude.





Focus the Evaluation
1. What are you going to evaluate?  Describe the program (logic model)
Band has been running since September 2009.  My goal is to see whether students, parents, teachers, and administrators view the band programme as going well, and whether they are enjoying having a band in the school.
Grade five students are introduced to band instruments and given the opportunity to try them to see which ones they prefer.  If they wish to be in band, they choose an instrument and get parental permission to join the programme.  Parents arrange for rental or purchase of instruments.  (See Logic Model)
2. What is the purpose?  I would like to see what is going well in the programme and what could be improved.  I would also like to gather suggestions about what improvements could be made and how.
3. Who will use the evaluation?
The primary user of the information is the band teacher (me), followed by the principal, students, and parents.  The principal, teachers, and I will incorporate suggestions for improvement into teaching and programme planning, for those elements that can be changed.  The principal will support the programme and its improvements, students will participate in and accept changes, and parents will support the programme and improvements.  The Division will also be given a copy of the evaluation to use in evaluating other band programmes.
4. What questions will the evaluation seek to answer?
1. What is going well?
2. What could be improved?
3. How could suggestions be implemented?
4. What is the level of satisfaction with the programme?
5. Are band programme goals of the students, parents and division being met?
To answer these questions I will need to administer a questionnaire and/or interview students, teachers, parents, and administration regarding elements that are going well and elements that could be improved.  To encourage improvement of the programme I will also ask each of these groups for suggestions.  This will give me the information I need to answer the questions.
5. When is the evaluation needed?
The surveys will need to be returned to me by the end of May (pilot study surveys by the end of April), so that compilation and organization of the data can be finished by the end of June and the evaluation completed before the end of the school year.  This would allow time for suggestions to be implemented in the 2011/2012 school year.
6. What evaluation design will you use? 
As the programme is already established, outcome evaluation will be used.  I want to see what is going well, and receive suggestions for improvements.  I also want to know if utilizing suggested programme changes improves programme quality. 

Collect the information
7. What sources of information will you use?
This programme has only been running for 1.5 years, so the little recorded information that exists comes from observation by administrators, parents, students, and teachers.  A more thorough evaluation will be done through surveys and interviews of school administrators, students, teachers/staff, and parents.  The principal has agreed to administer the survey to the students during school and collect the forms when finished.  Interviews will be taped and transcribed.  All of the information will be collected during the programme.
8. When will you collect data for each method you have chosen?
By the end of April, I hope to pilot the survey with a systematic random sample (every third person) from another school where I teach.  This will allow me time to edit the survey and improve its validity before the May deadline.  During class time in the first two weeks of May, students in grade 5 and up, and their parents, will be asked to complete the survey.  Interviews of teachers and the school administrator will be done after school.
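The "every third person" pilot selection described above is a systematic sample, which could be sketched as follows (the names here are hypothetical; the real pilot would draw from the other school's class list):

```python
# Systematic sample: take every third name from a roster.
def every_third(roster, step=3, start=0):
    """Return every `step`-th person, beginning at index `start`."""
    return roster[start::step]

# Hypothetical class list standing in for the pilot school's roster.
students = ["Ava", "Ben", "Cal", "Dee", "Eli", "Fay", "Gus", "Hana", "Ian"]
pilot_group = every_third(students)
print(pilot_group)  # ['Ava', 'Dee', 'Gus']
```

A systematic sample like this is easy to do by hand with a paper roster, which suits a small pilot; a truly random sample would require drawing names by lot instead.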

Analyse and interpret
9. How will data be analyzed?
Survey answers regarding strengths will be categorized and tabulated by hand, as will those regarding weaknesses and suggestions.  Ratings on Likert-scale questions will be tallied by the number of responses at each point on the scale, and a mean rating calculated.  Yes/no questions will also be tallied.  All suggestions will be listed along with the number of times each suggestion was made.
Interview data will be organized into three main categories: strengths, weaknesses, and suggestions for improvement.
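The hand-tallying described above could be mirrored in a short script; the responses and suggestions below are hypothetical placeholders for the real survey data:

```python
from collections import Counter

# Hypothetical Likert responses (1-5) and free-text suggestions.
likert_responses = [4, 5, 3, 4, 2, 5, 4, 3]
suggestions = ["more practice rooms", "evening concerts",
               "more practice rooms", "newer instruments",
               "more practice rooms"]

# Tally each rating and each repeated suggestion, then compute a mean rating.
rating_counts = Counter(likert_responses)
suggestion_counts = Counter(suggestions)
mean_rating = sum(likert_responses) / len(likert_responses)

print(dict(sorted(rating_counts.items())))  # {2: 1, 3: 2, 4: 3, 5: 2}
print(suggestion_counts.most_common(1))     # [('more practice rooms', 3)]
print(round(mean_rating, 2))                # 3.75
```

The same tally-then-summarize pattern covers the yes/no questions as well, with a two-category count instead of a five-point one.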
10. How will data be interpreted, and by whom?
After the information is organized, the principal and I will interpret the findings in terms of strengths, weaknesses, and areas for improvement in the programme.  The information will also be shared with other teachers to gain insight into possible improvements.
11. Use the Information
A summary of the final results, prepared by me, would be shared with staff at the first staff meeting, with students in the first September class, and with parents through a September letter.  This communication would explain the survey results and proposed changes.  Board administration would also be sent the results.
It is important to note that a follow-up evaluation would need to be done approximately six months to one year after the proposed changes are implemented, to monitor their success.  (This, however, could be done more informally through observation, testimonials, or interviews.)
12. Manage the Evaluation
No one requires protection except perhaps me, the teacher, if the programme does not meet the established goals.  Who is responsible for what, and when, has already been addressed, and as there is no budget beyond photocopying, management of the evaluation is essentially complete.
The evaluation will be completed without lapses in propriety because the principal will review the materials beforehand.  Involving the principal throughout the evaluation will also help maintain the accuracy of the information and results.

I teach new band programmes in two other schools.  The culture in all three schools is completely different.  I would like to evaluate these other programmes as well to see how they could be improved, taking advantage of the effort I have already made to make our band programmes as successful as possible.





Saturday, January 29, 2011

Assignment #2


ECUR 809 – Assignment #2

I was rather disappointed and confused about which part of this programme was being evaluated.  Given the number of possible goals for the programme, the goals I chose to focus on are: 1. exercise could be a primary way to prevent type 2 diabetes; 2. since First Nations women have a higher rate of GDM than non-Aboriginal women, it would be beneficial to see whether a prenatal exercise programme offered to First Nations women would reduce the level of GDM, and of type 2 diabetes, in Canada’s First Nations population.
The Stufflebeam CIPP model could effectively evaluate the success of this programme if more quantitative data were collected, tracking the GDM results of the participants in the study as well as of a comparison group who did not participate.  The Stufflebeam model looks at the broad picture, systematically collecting information about inputs (I), activities, and outcomes, which gives the information relevance.  Programme context (C), of which there was a plethora, was thoroughly outlined in the document, as was the process (P).  Formative evaluation could occur during the programme, promoting accurate, prompt GDM testing results.  Summative evaluation would need to be ongoing for several years, as type 2 diabetes often appears several years after birth.  The organized data, or product (P), found in the results could be used to make improvements during the course of the programme.
Stufflebeam’s model is based on making decisions.  Even though no decisions were actually described in the example, deciding would be the next step after evaluating the data.  Positive or negative findings would require a decision to maintain, improve, or cancel the programme.
Including characteristics of the Provus Discrepancy model, improving the programme and giving greater accountability to educators and the public, would also be an important part of this evaluation.  Increased public knowledge of results would hopefully lead to wiser, better decisions by health administrators.  A cost analysis should also be done to determine whether programme costs were less than the original health costs; all of these are features of the Provus Discrepancy model.
If the Stufflebeam and Provus Discrepancy models were used together, the exercise programme could be effectively evaluated to see whether there was indeed a benefit to continuing it in order to reduce GDM and type 2 diabetes in the First Nations population.

Saturday, January 22, 2011

Assignment #1


An Evaluation of Elementary School Nutrition … Jennifer Sherry

Recently, emphasis has been placed on the health of our children, specifically pertaining to obesity.  This is what led to the Evaluation of Elementary School Nutrition Practices and Policies in a Southern Illinois County by Jennifer Sherry.  Obesity has been associated with increased adult mortality through heart disease, high blood pressure, and stroke, and because diet is a major factor in obesity, school breakfast and lunch programmes were targeted.
More than 84% of children exceeded the United States national recommendations for daily total fat and sodium intake, and although it was acknowledged that school diet was not the only contributor, school diet did affect students’ health in the immediate future.  Teachers, dietitians, school nurses, and food services supervisors needed to work together to encourage the best possible health for their students.  To ensure this, school breakfasts and lunches needed to be nutritious and appealing so that students did not skip meals or choose unhealthy alternatives.
In this evaluation, Sherry proposed to assess elementary school nutrition, cafeteria nutrition practices, the credentials of school food service managers, and their ability to collaborate with teachers regarding proper nutrition.  This was to be done using the School Health Index (SHI).  Eight of 14 rural Illinois public elementary schools gave permission for the evaluation.  The food services manager of each school was contacted to arrange a one-hour interview.  Although there were eight schools in the survey, there were only three different managers, as two managers were responsible for multiple schools.  Fourteen questions were used to collect information on the strengths and weaknesses of each school’s programme.  A pilot study was done first to check the validity of the instrument.  After adaptations were made, the managers were interviewed, then given the survey one week later.  The surveys were scored using a Likert-type scale (1 = under developed; 2 = partially in place; 3 = fully in place), and the scores were calculated as a percentage.
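The percentage scoring described above might work along these lines; the item scores below are hypothetical, and the exact SHI scoring formula is not given in the article, so this is only a sketch of the 1-to-3 scale converted to a share of the maximum possible score:

```python
# Hypothetical SHI-style item scores (1-3) for the 14 interview questions
# at one school: 1 = under developed, 2 = partially in place, 3 = fully in place.
scores = [3, 2, 3, 1, 2, 3, 3, 2, 1, 3, 2, 2, 3, 3]

max_total = 3 * len(scores)               # best possible total score
percentage = 100 * sum(scores) / max_total
print(round(percentage, 1))               # → 78.6
```

A school with every item "fully in place" would score 100%, which makes the percentages comparable across schools with the same number of items.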
Results of the survey showed that a federally acceptable programme, with decreased sodium and fat, was fully in place in four of the eight schools.  Four of the eight schools did not promote healthy eating practices through announcements or posters; staff at four of the eight schools had emergency training in food allergies, choking, and natural disasters.  Four schools used one or two methods in the classroom to reinforce healthy eating.  Five schools had only a credentialed manager, while three had a manager with a master’s degree in Home Economics.
On first perusal, the study appears to have been thoroughly done.  It was formalized, and a description of what occurred was included.  The purpose, to assess elementary school nutrition programmes, was clearly outlined, and there was a strong connection between the results and the purpose of the survey.  All of these points are part of the Stake Countenance Model.
Part of the survey was to garner suggestions for improving the existing programme, a characteristic of the Provus Discrepancy Model.  Accountability existed between the schools and the government, as taxpayers were financing the programmes, and justification was needed for continued support.  The Stake Countenance Model and the Provus Discrepancy Model were the two main models utilized in this survey.
The survey itself was organized, the goals were clearly outlined, and a pilot test was run with a school of the same size and social climate as the others.  The survey was based on an apparently “proven” model, established and used repeatedly by United States schools interested in evaluating their nutrition programmes.  Liabilities arise in that only 8 of the 14 schools were surveyed, just over half, which is a very small sample.  Are there only 14 rural elementary schools in the county, or are there more that could have been surveyed?  I would also question that the only people surveyed were the managers.  All three of them!  All of the information accumulated came from three sources.  The survey showed that only one manager had more than a high school education.  Were the programmes at the schools with the master’s-degree manager the ones that were “fully in place”?  Also, the report showed that four of eight programmes had “reduced” their sodium and fat intake; it did not show that the intake was equivalent to or less than the federal requirement of 574 mg.
Yes, the survey showed that there could be more collaboration among nurses, managers, and teachers, and that more planning and discussion were necessary for schools to continue developing healthier eating habits.  However, was there not an easier, cheaper way to glean this information?

Retrieved from: file:///Users/kathleenwickenhauser/Desktop/809/An%20Evaluation%20of%20Elementary%20School%20Nutrition%20Practices%20and%20Policies%20in%20a%20Southern%20Illinois%20County%20—%20The%20Journal%20of%20School%20Nursing.webarchive