Hi everyone
V2 of my draft evaluation plan is now on Google Docs: http://docs.google.com/Doc?id=dcxrtrqz_2gbxxwchq . I have discussed my first draft with Bronwyn and made amendments, but the questionnaires are new, so I'm keen to have feedback on them, as well as any other general feedback.
Thanks, and I will now be able to get around to giving you feedback on your drafts.
bye for now
Michelle
Sunday, May 17, 2009
Saturday, May 2, 2009
Weeks 7 and 8 - Draft of evaluation plan
Hi there,
Here is a draft of my evaluation plan. Would appreciate any feedback or suggestions.
Draft of evaluation plan
Introduction
This document describes the evaluation plan for a planned online software training course for staff at Manukau Institute of Technology (MIT). The software courses taught may include Microsoft Office applications (e.g. Word, Excel, PowerPoint, Outlook, Publisher, Visio) and some MIT web-based applications (e.g. the portal and the intranet site). The course will be developed mainly by the Staff Computer Trainer, who is working in conjunction with the Manager of MIT Short Courses. Together they are looking at developing online software training: the Staff Computer Trainer for MIT staff, and the Manager of MIT Short Courses for industry training. The evaluation, which is a Needs Analysis, will be conducted in May 2009 and is concerned with the online training for MIT staff.
The Needs Analysis is based on the Eclectic-Mixed Methods-Pragmatic Paradigm, using the Mixed Methods Evaluation Model.
Background
The MIT Staff Software Trainer currently runs face to face courses and one on one training on Microsoft Office applications and some MIT web-based applications. She believes that there is a need for online training to cater for learners who:
· Cannot attend face to face courses due to timetabling constraints;
· Prefer self-paced learning;
· Would benefit from blended learning that complements the face to face training.
Purpose:
The purpose of this Needs Analysis is to obtain accurate information that confirms there is, in fact, a need for this type of elearning, and to optimise the design and delivery of the course by establishing the learners' needs in terms of content and learning styles, leading to the types of activities that will engage them.
Limitations:
One limitation could be the time involved in carrying out this evaluation on top of a normal (already heavy) workload.
Audiences
The Needs Analysis will be carried out on a cross-section of MIT staff. Experts will also be involved in the evaluation process, including the Manager, MIT Computer Short Courses; the Learning Technology Centre Manager; the Learning Technology Centre Advisor; and the Software Support Manager.
Decisions:
I cannot foresee any negative outcome of this evaluation. From informal questioning, there does appear to be an interest in this elearning course, and I expect this evaluation to confirm that, and also to give constructive feedback as to the content and design of the course.
Questions:
These questions have been taken from the Elearning Guidelines for New Zealand:
SD2: Do students have any choice in terms of what they learn, the particular resources they will study and/or the learning activities they will engage in?
i. How would students like to use this elearning course, e.g. self-paced learning or blended delivery?
ii. What resources will best engage the learner while providing for maximum learning outcomes, e.g. audio visual demonstrations, text-based instructions, interactive activities, formative and/or summative assessments (online tests)?
iii. Which software training courses will best be suited to online training?
iv. How will I monitor what is working well and what isn’t (related to the design of the modules)?
ST1: Do you have a way to identify student needs and respond to them?
i. Are the staff at MIT interested in online software training?
ii. Of those that are interested in online software training, are they familiar with Emit (Blackboard)? If yes, do they like using Blackboard; if not, how will training on the use of Blackboard be provided?
iii. What are effective methods for receiving constructive feedback from learners on their needs, e.g. what further online training they would like?
iv. What systems will be used to ensure that the learner is learning at an appropriate level, and relevant to the needs of their role?
Methods:
I will be using the Mixed Methods Evaluation Model, which will allow me to use different methods to evaluate individual aspects (triangulation) and to evaluate a range of factors within each aspect (bracketing). This approach is well suited to the complexity of instructional design. (Here I will insert a table which demonstrates the relationships between the questions and the methods used to evaluate them.)
Sample:
The participants involved in this evaluation will be:
· A random selection of MIT staff (volunteers)
· Manager, Learning Technology Centre, and Learning Advisor, Learning Technology Centre
· My manager – Software Support User Manager
· Manager, MIT Computer Short Courses
· Other MIT lecturers and trainers with an interest in eLearning
Instrumentation
· A sample learning module to be prepared on Emit for testing by focus group
· A usability questionnaire to be prepared for staff who are involved in the focus group and testing of the learning module, in order to evaluate the design of the module and establish what resources/activities best hold the learner's focus and meet the learning outcomes.
· A questionnaire to be prepared for MIT staff
Logistics:
I will be responsible for the implementation, analysis and reporting of the evaluation.
Time Line
Implementation – 1 week
Analysis – 1 week
Reporting – 1 day
Budget
Captivate software for creating audio visual demonstrations - $200
My time
Appendices
A. Information sheet for participant (yet to be created)
B. Questionnaire - Usability of learning module (yet to be created)
C. Questionnaire – general questionnaire on the viability of online learning for software training (yet to be created)