ECUR 809 Sample survey questions; 1st Draft
Note that I did this originally in Word and it did not transfer well. I will post it anyway but will send Jay the original attachments directly.
Statement: Strongly Agree / Somewhat Agree / Neutral / Somewhat Disagree / Strongly Disagree
1. Since the TIPS for Residents course, I am more confident in teaching encounters.
2. Since the TIPS for Residents course, I am more effective in teaching encounters.
3. The TIPS for Residents course helped me improve my teaching skills.
4. I feel I am better equipped to teach after attending the TIPS for Residents course.
5. The TIPS for Residents course helped me improve my knowledge of effective teaching.
6. I am a more effective teacher as a result of attending the TIPS for Residents course.
7. As a result of the TIPS for Residents course, I am more aware of the need to use effective teaching strategies to enhance learning.
Short answer:
1. Describe any changes you have made in how you prepare for teaching sessions as a result of the TIPS for Residents course.
2. Describe how your knowledge of teaching and learning has changed as a result of the TIPS for Residents course.
3. Explain the most important thing(s) you learned at TIPS for Residents.
ECUR 809 Sample survey questions; 2nd Draft
Statement: Strongly Agree / Somewhat Agree / Neutral / Somewhat Disagree / Strongly Disagree
1. The TIPS for Residents course helped increase my confidence during formal teaching encounters (e.g., rounds-like presentations, structured lessons).
2. The TIPS for Residents course helped increase my confidence during informal teaching encounters (e.g., unstructured encounters with medical students/JURSIs/junior residents).
3. The TIPS for Residents course helped increase my effectiveness in teaching encounters.
4. The TIPS for Residents course helped me improve my teaching skills.
5. I feel I am better equipped to teach after attending the TIPS for Residents course.
6. The TIPS for Residents course improved my knowledge of effective teaching.
7. The TIPS for Residents course has helped me teach more effectively.
8. The TIPS for Residents course has made me more aware that my role as a teacher is to enhance learning.
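For analysis, responses to rating-scale statements like these are typically coded numerically. The sketch below is illustrative only, assuming the common coding of 5 = Strongly Agree down to 1 = Strongly Disagree; the function and variable names are hypothetical, not part of the survey itself.

```python
from statistics import mean

# Illustrative coding: 5 = Strongly Agree ... 1 = Strongly Disagree
SCALE = {
    "Strongly Agree": 5,
    "Somewhat Agree": 4,
    "Neutral": 3,
    "Somewhat Disagree": 2,
    "Strongly Disagree": 1,
}

def summarize(responses):
    """Return the mean score and a per-category count for one statement."""
    scores = [SCALE[r] for r in responses]
    counts = {label: responses.count(label) for label in SCALE}
    return mean(scores), counts

# Hypothetical responses to statement #1
responses = ["Strongly Agree", "Somewhat Agree", "Neutral", "Somewhat Agree"]
avg, counts = summarize(responses)
print(avg)  # 4.0
```

A mean score per statement makes the pre/post comparisons discussed later straightforward, though reporting the full category counts avoids treating ordinal data as purely numeric.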
Short answer:
1. Describe any changes you have made in how you prepare for teaching sessions as a result of the TIPS for Residents course.
2. Describe any changes to how you teach a session as a result of the TIPS for Residents course.
3. Describe how your knowledge of teaching and learning has changed as a result of the TIPS for Residents course.
4. Explain the most important thing(s) you learned at TIPS for Residents and why this is important to you.
5. Please provide suggestions for how the TIPS for Residents course can be improved.
Rationale for Changes
I received feedback from co-workers on various aspects of my questions and altered them accordingly. Although I did not intend this to be a comprehensive survey (only a sample of possible questions), a couple of comments asked for fuller coverage of certain topics, so I obliged by creating a few additional questions. In fact, some of the questions are quite similar, but part of my exercise was discovering the best way to word each statement to elicit the type of response I was interested in.
Rating Scaled Questions
I changed the wording of several of these statements to articulate clearly that the changes being referred to are a result of TIPS. I also decided to make the statements as consistent as possible. I added #2 because it was suggested that it would be good to differentiate among types of teaching encounters, both informal and formal. I changed #6 specifically because it was the one question that focused more on the person and her role rather than on skills and knowledge. I followed a few additional tips to cut wordiness and make the statements more precise and consistent. I also decided to italicize the important distinguishing words in each statement to make clear exactly what is being referred to. I would like feedback on whether or not this is a good idea.
Short Answer
I added question #2 in response to a request to ask what residents might actually do differently, in addition to how they prepare. I added the second part of #3 (now #4) to elicit a fuller, more useful response from residents. I added the final question to address program improvement. I also added italics here for the reason stated above.
Wednesday, November 18, 2009
Wednesday, October 21, 2009
ECUR 809 Assignment #4
ECUR 809 Assignment #4 Logic Model
I chose a standard template for my project logic model as it fits quite well with the program I am describing. The overall goal of the TIPS for Residents program is to help residents become better teachers and more confident in teaching sessions. The objectives relate to the actual skills and knowledge necessary for this and are those listed in the TIPS for Residents manual. The activities describe the general setup of the workshops and how they are run. There is some blurring of the lines among outputs, outcomes, and impacts. The outputs are closely related to the objectives, as these are the skills and knowledge that can be measured immediately upon completion of the workshop activities. Outcomes are better thought of as competencies achieved by residents as a result of the skills and knowledge they have learned in the workshops; they concern whether residents are actually teaching more effectively and how confident they are in those episodes. Impacts flow from this and relate to the longer-term results of the workshops. Ideally, if residents are more confident and comfortable in teaching, and have the skills and knowledge to do it well, their experience will lead them to continue good teaching practices throughout their careers.
Unfortunately, I am a technological neophyte so I may need to send my logic model directly to Jay.
ECUR 809 Assignment #3
Assignment #3; ECUR 809; Evaluation Assessment
TIPS for Residents; College of Medicine, University of Saskatchewan
Engaging Stakeholders:
TIPS for Residents is a program delivered by Educational Support & Development in the College of Medicine, University of Saskatchewan. It became a required course for residents upon the approval of the Faculty Council. Its purpose is to prepare residents to teach medical students (undergraduates) effectively. A secondary goal is to help residents be, and feel, more confident in the teaching and learning presentations they are required to give. Thus, the Undergraduate Medical Education Department is the primary stakeholder. Because of the commitment of residents to this program, postgraduate medical educators are involved and interested in the delivery and success of TIPS. Finally, the creators of TIPS for Residents, the facilitators of the workshops, and Educational Support & Development in general all have a vested interest in TIPS for Residents. In sum, the stakeholders to consider include the Postgraduate Dean, Postgraduate Program Directors, the Director of the Undergraduate Medical Program, residents (represented by the Chief Residents), medical students, and the staff and faculty in Educational Support & Development involved in creating and delivering the program. All of the stakeholders (except medical students) have been contacted and asked to provide input on the following three questions:
1. What are the primary goals and objectives of the TIPS for Residents program?
2. How would you measure/determine success in these goals/objectives?
3. What information emerging from a program evaluation of TIPS for Residents would you find to be the most valuable?
Some responses have been collected and others will continue to be gathered to help create a focus and direction for the program evaluation.
Focus:
The purpose of the program evaluation is to determine the success of TIPS for Residents and identify possibilities for improvement. As such, this will be an outcomes evaluation, concentrating on goals and objectives. While the satisfaction of all stakeholders is important, the primary users of the program are the residents who participate in the workshops. The direct beneficiaries are medical students, who should experience more successful learning as a direct result of better resident teaching. A more complete description of the program can be found in the logic model to follow (assignment #4). The key questions concern residents’ knowledge of teaching and improvement in their skills as a result of TIPS for Residents. Are residents becoming better educators and more confident in teaching as a result of TIPS for Residents? What aspects of TIPS are helping in this regard? Are there ways to alter the program to improve residents’ skill and knowledge as educators?
Data Collection:
Ideally, it would be most beneficial to discover whether residents’ teaching skills improve as a result of the program, but observing this directly would be logistically very difficult. However, if the quality of resident teaching is improving, it should be revealed by interviewing medical students and gathering their feedback about strengths and weaknesses. Interviews with Program Directors and other faculty could also provide insight into whether presenting at Rounds has improved as a result of the program. A pre-test before the workshops and a post-test shortly after, administered to residents, would also help indicate the effectiveness of TIPS for Residents. The feasibility of a pre-test and post-test would have to be determined with input from the stakeholders and with budget allowances in mind. If that is not possible, a post-test alone would still help gauge the effectiveness of the program. Self-assessment data from residents would provide valuable insight into how they see their knowledge and skill as educators as a result of TIPS for Residents. Existing sources of data include workshop evaluations completed by residents over the past three years. In sum, data collection would take the form of interviews with Program Directors, faculty, and medical students; pre- and post-test surveys/questionnaires of residents, including some self-assessment; and existing data from resident workshop evaluations. It would also be valuable to interview facilitators of the workshops; however, I am currently the primary facilitator, so objectivity could be an issue. Some of the above data could also be collected through focus groups, again with consideration for budget availability and stakeholder preference.
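The pre-test/post-test comparison described above can be sketched in a few lines. This is a minimal illustration only; the scores below are hypothetical placeholders, and the key point is that a paired design analyzes each resident's change rather than comparing the two groups as a whole.

```python
from statistics import mean, stdev

# Hypothetical mean self-assessment scores (1-5 scale) per resident,
# measured before and after the TIPS workshops
pre =  [2.8, 3.1, 2.5, 3.4, 2.9]
post = [3.9, 3.6, 3.2, 4.1, 3.8]

# Paired design: compute the per-resident change
changes = [b - a for a, b in zip(pre, post)]

print(round(mean(changes), 2))   # average improvement: 0.78
print(round(stdev(changes), 2))  # spread of the improvements
```

With real data, the mean change and its spread would feed a paired significance test; with only a post-test, one would instead compare the distribution of scores against the neutral midpoint.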
Analysis and Interpretation:
All stakeholders will be provided with the results of the program evaluation and will be able to give input into how best to interpret them. Ultimately, Educational Support & Development would then take any suggestions and incorporate them into the planning and implementation of TIPS for Residents. Marcel D’Eon, Director of Educational Support and Development, is the primary decision maker and the person to consult in all matters pertaining to the program. The goals of the program evaluation would be to determine what is working well (so as to continue it) and what needs to be modified and/or improved. In addition, Educational Support & Development would be conducting the research and collecting the data. As mentioned, some of this has already been collected in the form of the workshop evaluations. Krista Trinder is a Research Assistant with Educational Support & Development and would likely have an important role in conducting the research and interpreting the results. Krista is currently applying for funding for a program evaluation of TIPS for Residents, so the result of that application will go a long way toward determining the feasibility of the data collection measures proposed above. In addition, the methods will involve time commitments from stakeholders and medical students; therefore, their accessibility and involvement will also be considerations. The timeline for completing the program evaluation is likewise dependent on the results of the funding request.
Friday, September 18, 2009
ECUR 809 Assignment #2
The following is an assessment of possible evaluation strategies regarding Alberta Education’s ECS Programming for Children with Severe Disabilities.
Given the ECS program’s description, I believe the primary evaluation strategy should be an outcome-based, goal-oriented, summative evaluation. The ultimate purpose of this program is that it “must meet the child’s needs”; a very broad objective, but one that is paramount in a program such as this. Michael Scriven’s model would be appropriate in that the reasons for the program need to be identified. The effectiveness of the program would be measured by how well the primary goals have been achieved. Ideally, this evaluation would start by identifying the specific needs of these children, followed by an assessment of how well those needs are being met.
This program serves children with severe/profound disabilities, so, ideally, a pre-test and post-test would reveal how well their needs have been met. If a pre-test is not feasible, I believe much could still be learned from a post-test alone. Measures might include changes in skills, behaviours, and learning while the children were in the program. Since the program lasts a maximum of three years, children could also be monitored for changes after leaving the program to give further indication of its effectiveness. In addition to the children themselves, much information could be gathered through interviews with other care providers, such as social workers or health services staff involved in the child’s care. The primary source of information on whether the needs of these children are being met would ultimately be the parents or principal caregivers. Knowing the child best, these people are in the best position to reflect on whether the program is indeed providing the benefits (i.e., meeting the child’s needs) that they require.
I would not rule out a formative or process evaluation, but I think the summative one would be valuable to do first to help guide the direction of the formative one. If interviews with primary caregivers reveal that some facets of the program are inadequate, that would help immensely in directing the issues the formative evaluation focuses on. For example, a process evaluation could look at the allocation of in-home versus center-based services, the average length of home visits (1.5 hours), the number of home visits per year (a minimum of four), the age criteria for eligibility, and the criteria for assessing the child’s current level of functioning. In fact, this type of program would lend itself quite well to a combination of summative and formative evaluation, where the outcomes are assessed and, at the same time, possible improvements in the process are identified. This evaluation would be time consuming, as much of the information would be obtained through personal interviews, but it would be all the richer for the insights gleaned from this approach. Practical solutions to any potential problems could result directly from involving the people in the best position to analyze how the program is working, or worked, for the child in their care.
Thursday, September 10, 2009
ECUR 809 Assignment #1
Program Evaluation Summary
The following summary pertains to the North Carolina General Assembly’s program evaluation regarding controlling the cost of Medicaid private duty nursing services, Dec. 2008
http://www.ncleg.net/PED/Reports/documents/PDN/PDN_Report.pdf
The Program Evaluation Division’s report on controlling private duty nursing costs in North Carolina is, as the name suggests, a cost-effectiveness evaluation. The impetus for the evaluation was the fact that the number of recipients receiving private duty nursing, and the costs of their care, had outpaced the growth of Medicaid from 2003-04 to 2006-07. North Carolina Medicaid funds private duty nursing benefits, and the goal of this program evaluation was to determine the cost savings of alternatives.
The evaluators used extensive data sources in their study, including Medicaid expense records, both federal and from other states, as well as interviews with private duty nursing recipients. The former gave the study a solid quantitative grounding. For example, the Medicaid cost for recipients of private duty nursing was compared with the amount spent on residents of nursing facilities. Total costs, costs per recipient, and numbers of recipients were compared from 2004 to 2007. Trends were used to create projections for future years, illustrating the need for cost controls. Interviews provided a qualitative dimension, although too few were conducted (ten recipients and/or their families). Interviewees provided reasons why they chose to receive private duty nursing services.
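A trend projection of the kind the evaluators used can be sketched with a simple least-squares fit. The annual cost figures below are placeholders, not the report's actual numbers; the report's own figures would be substituted to reproduce its projections.

```python
# Placeholder annual costs (in $ millions) for fiscal years 2004-2007;
# the real report's figures would be substituted here.
years = [2004, 2005, 2006, 2007]
costs = [30.0, 36.0, 43.0, 52.0]

# Ordinary least-squares slope and intercept, computed by hand
n = len(years)
mx = sum(years) / n
my = sum(costs) / n
num = sum((x - mx) * (y - my) for x, y in zip(years, costs))
den = sum((x - mx) ** 2 for x in years)
slope = num / den
intercept = my - slope * mx

def project(year):
    """Linear projection of cost for a future year."""
    return slope * year + intercept

print(round(project(2010), 1))  # projected cost for 2010: 73.1
```

A linear fit is the simplest choice; if costs were growing by a roughly constant percentage each year, fitting the logarithm of the costs would give a more realistic exponential projection.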
Twenty states pay for private duty nursing for adults under their state care, and North Carolina is one of only two that do not set limits on benefits. Therefore, the evaluators used the cost-containment mechanisms of these other states to guide their recommendations for North Carolina. Furthermore, the Division of Medical Assistance had created suggestions for cutting costs, including establishing clearer and more objective criteria for evaluating recipient need, which the North Carolina evaluators used in their recommendations.
While the North Carolina evaluation is comprehensive, looking at past results, projections, and possible solutions, it certainly has its limitations. Some of the cost-containment recommendations, although intuitively reasonable, had no quantitative justification. Since private duty nursing costs were combined with other health care expenditures, there was no way to tease out the actual cost savings of several mechanisms. Similarly, the evaluation noted two potential conflicts of interest that might lead recipients to receive private duty nursing services longer than they need, or perhaps without needing them at all. First, physicians’ care for their patients may bias them toward recommending licensed nursing services more frequently than necessary. Second, the recipient’s home care agency, which has a financial stake in renewing care, may influence decisions on continued care. While the authors of the evaluation are likely correct that these forms of patient assessment (the Private Duty Nursing Team depends on these external assessments) are problematic and that some form of independent assessment would be better, they show no evidence that the decisions of physicians and home care agencies are actually biased. Because such bias would be extremely difficult to measure, the potential cost savings of hiring independent assessors were impossible for these evaluators to estimate.
A couple of final problems are evident. Although it is understandable that a Medicaid program would look to other Medicaid programs in the United States for guidance on cost savings, potential solutions will be missed by not looking at medical systems elsewhere in the world. American health issues are not unique, and the evaluation might have been more insightful had it examined health care systems in a few other countries with similar structures. Finally, the recommendations essentially boil down to reducing the number of recipients using private duty nursing services. The sparse qualitative data in the evaluation note that recipients chose private duty nursing because of better quality of life at home and better care than in nursing facilities, which suggests that decreasing the number of recipients might not be the best tack. Where human health is concerned, the bottom-line approach can be dehumanizing and, in this case, narrows the search for solutions to reducing costs. If quality of life were given more prominent status, perhaps more attention could have been directed toward finding ways to reduce costs elsewhere in the medical system.