Monday, July 6, 2009

Summary of Results

Here is a summary of my results. My v2 draft, complete with Bronwyn's comments, is on http://docs.google.com/View?id=dcxrtrqz_25gxbtp5fw if you are interested in viewing it.

Summary of Expert Review – Questionnaire and Focus Group discussion
Overall, the feedback was positive. On the whole, the results showed that the module was well designed and that the content and resources engaged the learner and would enable learners to meet their objectives. There were a couple of minor hitches; for example, the Excel workbook didn’t fully maximise, so the second sheet tab with the “model answers” wasn’t visible. It was good to get this feedback at this stage so that it could be easily rectified.

The audio visual demonstration and the Excel interactive exercise were most helpful to learning, and it was felt that this design could be used for a variety of online learning topics. Self-paced learning was considered the best use of the learning module.

Improvements could be made to this learning module by providing more practice exercises and by making instructions concise, simple, and very obvious.

It was recommended that video demonstrations be created with and without audio to allow easier access for those who may not have sound on their computers (eg if they’re doing the course from home).

Currently a hard copy of a Training Needs Analysis is used to recommend training for new staff during their induction training. A Training Needs Analysis is only carried out on existing staff if requested by the staff member or their manager.

The focus group recommended that an online Training Needs Analysis be used in place of the hard copy needs analysis currently used. An online needs analysis would be more accessible and more effective than the current hard copy. This could easily be created in Wimba Create and possibly placed on the intranet site and/or eMIT. For staff enquiring about or booking on computer courses, a short online pre-test could be made available.

Summary of MIT Staff Questionnaire
The majority of respondents (83%) responded that online software training would increase their skill set related to using computer software. Although the majority (72%) were familiar with using a Learning Management System (LMS), 56% indicated they had no preference either way about using one. This would require follow-up once they have been exposed to the LMS.

As shown in Table 5, the majority of respondents (61%) showed a preference for self-paced learning. However, 56% of respondents also showed a preference for blended learning. Given the difficulty these staff have in attending face to face classes during their work time, I assume that they would attend the blended learning courses outside of their normal work hours (some of these staff do not work regular Monday to Friday, 9-5 hours).

Audio visual demonstrations and interactive exercises were considered the resources most helpful to their learning. Microsoft Office applications were the most frequently mentioned as the software courses best suited to online training, although other applications were also mentioned, such as portal training and Adobe products.

Comments from respondents indicated that the majority were interested in the flexibility of self-paced learning – the ability to learn in their own time, where and when they wanted to.

Thursday, July 2, 2009

Draft Result of evaluation

Please click on this link http://docs.google.com/View?id=dcxrtrqz_20g4hkkddv to see the draft results of my evaluation.

(Thanks Pradeep for bringing to my attention that I hadn't put this link in my blog.)

Bronwyn has now added her comments.

Would welcome any further comments.

Thanks,
Michelle

Tuesday, June 2, 2009

Initial thoughts after evaluation

Findings from the Expert Review of the Sample Learning Unit
I think that the Expert Review was the most valuable feedback that I received. The seven experts sampled the learning module and then completed an evaluation on Blackboard. They had a 15-minute break while I reviewed the results, and then as a group we had further discussions to clarify and expand on the ideas.

One aspect that was highlighted was that many people don't read instructions on the internet, or skim their way through. A couple of people rushed through the module and didn't even notice some of my activities, eg the video demo! So instructions need to be kept to a minimum, and those instructions should be VERY obvious, brief and simple.

Another topic that was well discussed was the use of audio in the learning module. I thought I had prepared for this by adding text captions, but apparently if the computer doesn't have sound, then the video demo doesn't open at all. Although all MIT staff do have sound on their computers, some may be accessing the learning module from home. Not sure if this issue is just an MIT issue, or maybe related to the software being used (they used Camtasia, I am using Captivate at present). Will have to check this out further.

I am quite keen to use audio in my learning module, as research shows that this adds to the learning experience and caters for different learning styles. Also, 83% of those surveyed in my staff questionnaire preferred audio visual demonstrations. One option suggested was to create one video demo with sound and one without, so staff could select the appropriate medium. If any of you have had experience with this, I would appreciate any feedback.

See results of the Expert Review of the Sample Learning Unit on http://spreadsheets.google.com/ccc?key=r19WdTxDb12ig_9iM7S931A

Another question that I brought to this focus group was "What method should be used to find out the training needs of the staff?"

I currently use a hard copy of a Training Needs Analysis to recommend training for new staff during their induction training. Training Needs Analysis is only carried out on existing staff if requested by staff member or manager.

After much discussion it was agreed that an online Training Needs Analysis would fit the bill. It could easily be created in Wimba Create and possibly placed on the intranet site, alongside the Health & Safety online quiz (for new staff).

For staff booking on computer courses, a short pre-test could be emailed to them when they register, to check that they are at the right level for the course, or to recommend another course - this could easily be done in Wimba Create.

Could also put TNA on Blackboard (staff could self-enrol).


MIT Staff Questionnaire
This questionnaire was completed by 18 staff, consisting of FreeB Community Facilitators and Call Centre staff. There were no surprises here; the results were what I expected. The questionnaire was done in SurveyMonkey, and I was really impressed with how easy it was to use and to download the results.

I used a 5-point Likert scale questionnaire. On one question, "I would need training in the use of an LMS (Blackboard)", 10 out of 18 responded "Neutral" - not very helpful, and it left me wondering. On reflection, I would choose a 4-point scale in future, so there is no neutral midpoint to sit on.

The preference was for self-paced learning (61%), followed by blended delivery (56%). This was to be expected with the group of staff that I chose - they are from offsite locations or in roles where they cannot easily attend face to face training.

The resources that these staff thought would be most useful to their learning were audio visual demos and interactive exercises.

The results of the survey, taken by 18 MIT staff consisting of FreeB Community Facilitators and Call Centre staff, can be viewed on http://spreadsheets.google.com/ccc?key=rar-Fhxk14bjim7ebihrEAg.


MIT-wide survey
As part of a wider survey, 35 selected MIT staff (managers or their representatives across all departments) were asked if they had any interest in online software learning for themselves or their staff. 26 said "yes", 1 said "no", and 4 said "maybe/unsure".

Presentation of Data
I am thinking of presenting the data in graph format because I think this visual approach is more effective than reading figures. The graphs would be either bar charts or pie charts, depending on the data.
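To make this concrete, here is a minimal sketch of the two chart types, using figures already gathered above. It assumes Python with the matplotlib library - just one option for producing the charts (Excel would do the job equally well), and the script is illustrative only, not part of my evaluation plan.

```python
# A rough sketch of the two chart types I am considering, using
# figures already gathered. Assumes Python with matplotlib installed -
# any charting tool would do the same job.
import matplotlib.pyplot as plt

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(9, 4))

# Bar chart: delivery-mode preferences from the staff questionnaire.
# Respondents could prefer more than one mode, so these don't sum to 100%.
modes = ["Self-paced", "Blended"]
percentages = [61, 56]
ax1.bar(modes, percentages)
ax1.set_ylabel("% of respondents")
ax1.set_title("Preferred delivery mode")

# Pie chart: interest in online software learning from the MIT-wide
# survey (35 staff asked: 26 yes, 1 no, 4 maybe/unsure).
labels = ["Yes", "No", "Maybe/unsure"]
counts = [26, 1, 4]
ax2.pie(counts, labels=labels, autopct="%.0f%%")
ax2.set_title("Interest in online software learning")

plt.tight_layout()
plt.show()
```

The bar chart suits the delivery-mode question, where respondents could prefer more than one option, while the pie chart suits the yes/no/maybe question, where the responses make up a whole.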

All for now,

Michelle

Monday, June 1, 2009

Final evaluation plan submitted

My final evaluation plan can be viewed on http://docs.google.com/View?id=dcxrtrqz_3c5cm8ff6. This has been submitted for marking.

I have incorporated the feedback received from Bronwyn on my questionnaires, and also the feedback received from Krishan to include FreeB community facilitators, who are less able to access the face to face training that is provided for staff because of their offsite locations.

Michelle

Sunday, May 17, 2009

Draft of Evaluation Plan v2

Hi everyone
V2 of my draft evaluation plan is now on Google Docs: http://docs.google.com/Doc?id=dcxrtrqz_2gbxxwchq. I have discussed my first draft with Bronwyn and made amendments, but the questionnaires are new, so I am keen to have feedback on them, plus any other general feedback.

Thanks - and now I will be able to get around to giving you feedback on your drafts.

bye for now
Michelle

Saturday, May 2, 2009

Weeks 7 and 8 - Draft of evaluation plan

Hi there,
Here is a draft of my evaluation plan. Would appreciate any feedback or suggestions.

Draft of evaluation plan
Introduction
This document describes the evaluation plan for a planned online software training course for staff at Manukau Institute of Technology (MIT). Software courses taught may include Microsoft Office applications, eg Word, Excel, PowerPoint, Outlook, Publisher, Visio, and some MIT web-based applications, eg the portal, the intranet site, etc. It will be developed mainly by the Staff Computer Trainer, who is working in conjunction with the Manager of MIT Short Courses. Together they are looking at developing online software training: the Staff Computer Trainer for the staff at MIT, and the Manager of MIT Short Courses for industry training. The evaluation, which is a Needs Analysis, will be conducted in May 2009. This evaluation is concerned with the online training for MIT staff.
The Needs Analysis is based on the Eclectic-Mixed Methods-Pragmatic Paradigm, using the Mixed Methods Evaluation Model.

Background
The MIT Staff Software Trainer currently runs face to face courses and one on one training on Microsoft Office applications and some MIT web-based applications. She believes that there is a need for online training in order to:
· Cater for learners who cannot attend face to face courses due to timetabling constraints;
· Cater for learners who prefer self-paced learning;
· Possibly provide blended learning to complement the face to face training.

Purpose:
The purpose of this Needs Analysis is to obtain accurate information that confirms that there is, in fact, a need for this type of elearning, and to optimise the design and delivery of the course by establishing the needs of the learners in terms of content and learning styles – leading to the types of activities that will engage them.

Limitations:
One limitation could be the time involved in carrying out this evaluation, on top of normal (already heavy) workload.

Audiences
The Needs Analysis will be carried out on a cross-section of MIT staff. Experts will also be involved in the evaluation process, including the Manager – MIT Computer Short Courses; the Learning Technology Centre Manager; the Learning Technology Centre Advisor; and the Software Support Manager.

Decisions:
I cannot foresee any negative outcome of this evaluation. From informal questioning, there does appear to be an interest in this elearning course, and I expect this evaluation to confirm that, and also to give constructive feedback as to the content and design of the course.

Questions:
These questions have been taken from the Elearning Guidelines for New Zealand:

SD2: Do students have any choice in terms of what they learn, the particular resources they will study and/or the learning activities they will engage in?
i. How would students like to use this elearning course (eg self-paced learning, blended delivery)?
ii. What resources will best engage the learner, whilst providing for maximum learning outcomes eg audio visual demonstrations, text-based instructions, interactive activities, formative and/or summative assessments (online tests)?
iii. Which software training courses will best be suited to online training?
iv. How will I monitor what is working well and what isn’t (related to the design of the modules)?

ST1: Do you have a way to identify student needs and respond to them?
i. Are the staff at MIT interested in online software training?
ii. Of those that are interested in online software training, are they familiar with eMIT (Blackboard)? If yes, do they like using Blackboard; if no, how will training on the use of Blackboard be provided?
iii. What is an effective method(s) for receiving constructive feedback from the learners on their needs, eg what further online training they would like?
iv. What systems will be used to ensure that the learner is learning at an appropriate level, and relevant to the needs of their role?

Methods:
I will be using the Multiple Methods Evaluation model, which will allow me to use different methods to evaluate individual aspects (triangulation) and to evaluate a range of factors within each aspect (bracketing). This method is well suited to the complexity of instructional design. (Here I will insert a table which demonstrates the relationships between the questions and the methods used to evaluate them.)


Sample:
The participants involved in this evaluation will be:
· A random selection of MIT staff (volunteers)
· Manager, Learning Technology Centre, and Learning Advisor, Learning Technology Centre
· My manager – Software Support User Manager
· Manager, MIT Computer Short Courses
· Other MIT lecturers and trainers with an interest in eLearning

Instrumentation
· A sample learning module to be prepared on eMIT for testing by the focus group
· A usability questionnaire to be prepared for staff who are involved in the focus group and testing of the learning module, in order to evaluate the design of the module and establish what resources/activities best hold the learner's focus and meet the learning outcomes.
· A questionnaire to be prepared for MIT staff


Logistics:
I will be responsible for the implementation, analysis and reporting of the evaluation.

Time Line
Implementation – 1 week
Analysis – 1 week
Reporting – 1 day


Budget
Captivate software for creating audio visual demonstrations - $200
My time

Appendices
A. Information sheet for participant (yet to be created)
B. Questionnaire - Usability of learning module (yet to be created)
C. Questionnaire – general questionnaire on the viability of online learning for software training (yet to be created)

Saturday, April 25, 2009

Week 7 thoughts

Adrienne Moyle's comment on my Week 5 and 6 post gave me some ideas about setting up a discussion forum for my learners to gather feedback on their experiences of the course. This could be an ongoing forum - some of the feedback given may just be user training issues, which I could respond to online, and other suggestions may be design issues that I could consider and implement if appropriate. Thanks for that Adrienne.

Have just spent time going through my "list of things to do" for Weeks 7 and 8. As a result, I have revisited the eLearning Guidelines and have decided to change one of the guidelines that I had selected. The one that I have chosen instead of SD3 is:

SD2: Do students have any choice in terms of what they learn, the particular resources they will study and/or the learning activities they will engage in?

From this guideline, I hope to establish from my learners their needs in terms of content and learning styles - leading to the types of activities that will engage them.


Have retained ST1 guideline:

ST1: Do you have a way to identify student needs and respond to them?
The Needs Analysis will hopefully confirm that providing online software training for the staff at MIT will improve the options for learning (catering for different learning styles) and meet the needs of the learners. What I want to establish here is that there is a need for an online learning course for software training within the MIT staff.

Thanks Bronwyn for highlighting your feedback on Catherine's blog. This was very useful, and I will work on my sub-guidelines shortly.

Found the example of the evaluation plan, and also the evaluation tools, very useful. Was unable to access the Evaluation Cookbook, but will try again tomorrow.

So working on my draft plan next - any feedback would be appreciated.

Thursday, April 16, 2009

Weeks five and six: Evaluation methods

To recap, as I do not have an online course at present, I will be conducting a Needs Analysis to evaluate the idea of creating online software training for the staff at MIT, ie Microsoft Word, Excel, PowerPoint and other web-based programmes that we use at MIT. I intend to use Blackboard, as this is the learning platform used at MIT. I want to ensure that the learning modules that are developed meet the needs of our learners, and a big part of the Needs Analysis will be finding out from the learners what their needs are and how they best learn. From informal discussions that I have had with staff, it appears that there is definite interest and that different users would use this in different ways. So my initial thoughts are to provide for different needs and learning styles by offering a variety of learning strategies. But let's not jump the gun... hopefully the Needs Analysis will confirm my thinking.



As noted in my prior post, I have decided to use the Multiple Methods Evaluation model (triangulation and bracketing), which will give me the flexibility and variety to be able to evaluate the many different aspects and also the diverse needs of the learners (Hegarty, 2008).

The paradigm that seems to fit well with this evaluation model and eLearning, is the Eclectic-Mixed Methods-Pragmatic Paradigm discussed in Reeves' Educational Paradigms article (1996).

The Guidelines that I have chosen to work with in this evaluation project are:

SD3 From Guidelines Wiki
Do students gain knowledge relevant to employment and/or current thinking in their field?
What I want to evaluate here is that my learners (the staff at MIT) have an appropriate computing skill set to be able to carry out their jobs effectively.

ST1 From Guidelines Wiki
Do you have a way to identify student needs and respond to them?
The Needs Analysis will hopefully confirm that providing online software training for the staff at MIT will improve the opportunities for learning and meet the needs of the learners.

Have been reading Chapter 6, "Conducting a Needs Analysis", from Interactive Learning Systems Evaluation - Reeves & Hedberg (2003). Reeves and Hedberg say, "The overall purpose of needs assessment is to provide information to guide decisions about aligning an interactive learning system with important needs of specific audiences". This seems to fit well with what I am trying to achieve.

Of particular interest was a "Checklist for Evaluating an Elearning Program or Courseware", which looks at the content and the instructional design of the course. This will be a useful guide when I am looking at developing elearning modules.

Kaufman, Rojas & Mayer (1993) state that "The bottom line of the value of needs assessment is that it should provide the data needed to support the next level of design decisions so that the project moves forward to prototyping and formative evaluation activities". Sounds good to me!


The types of evaluation I intend to use are:

  • Interviews

  • Questionnaires

  • Focus groups

  • Checklists


References:

Hegarty, B. (2008). Types of Evaluation Models. Retrieved from http://wikieducator.org/images/b/b6/Types_of_Evaluation_Models08.pdf


Kaufman, R.A., Rojas, A.M., & Mayer (1993). Needs Assessment: A user's guide. Englewood Cliffs, NJ: Educational Technology Publications.

Reeves, T., & Hedberg, J. (2003). Interactive Learning Systems Evaluation. Englewood Cliffs, NJ: Educational Technology Publications.

Sunday, March 22, 2009

Week 4: Evaluation Paradigms and Models

I think that the Multiple Methods Evaluation model would work best for me, with the subject that I am evaluating.

My understanding of the Multiple Methods Evaluation model is that it uses different methods to evaluate individual aspects (triangulation) and evaluates a range of factors within each aspect (bracketing).

The Multiple Methods Evaluation model comes under the umbrella of the Eclectic-Mixed Methods-Pragmatic paradigm.

The reason that I have chosen this model is the flexibility of being able to use different methods of evaluation, and also the depth (and therefore quality) of the evaluation.

eLearning - is it andragogically sound?

I mentioned in our online chat last week an article by Robert John Muirhead (2007), "E-learning: Is this Teaching at Students or Teaching with Students" (Sam originally referred me to this article).

The article argues that eLearning is mainly not student-based learning but is more of a pedagogical than an andragogical approach - limiting the learner's experience to that which is planned and set out by the teacher.

Muirhead quotes Knowles and Associates (1984), who say that the four concepts that separate andragogy from pedagogy are:

"(a) adults need to be involved in the planning and evaluation of their instruction;
(b) adult experiences provide the basis for learning activities;
(c) adults are most intersted in learning subjects that have immediate relavance to their job or pesonal life, and
(d) adult learning is problem-centred rather than context-oriented"

Muirhead says that the exception to this is the discussion board, which leans more towards an andragogical approach.

How can we design our eLearning courses so that they are more andragogically sound? Evaluation would play an important part in this. It is critical that a Needs Analysis is conducted to ensure that adults are involved in the planning and evaluation of their instruction. This would cover the learning styles of the learners, their prior knowledge, and whether eLearning is the most appropriate medium for the course - to name a few.

With eLearning design, I think that extra effort has to be made by the teacher to be as flexible as possible in the delivery of the course. Obviously the content is prepared in advance, but whereas in a f2f classroom the teacher is easily able to adapt the session to the needs of the students at a moment's notice, this is not so easy in an online environment. This is where discussion boards and group work are invaluable and where critical thinking and creativity can be fostered.

What are your thoughts?

Michelle


References:
Knowles, M., and Associates. (1984). Andragogy in action: Applying modern principles of adult education. San Francisco: Jossey-Bass.

Muirhead, R. J. (2007). E-learning: Is this teaching at students or teaching with students? Nursing Forum, 42(4), October-December 2007.

Wednesday, March 11, 2009

Week 3: eLearning guidelines for quality

I have spent a lot of time firstly, figuring out what I wanted to evaluate, and then trying to work out which Guidelines would work best. Am still not sure that I have got it right, but here's a start!

Part of my job as the Staff Software Trainer at MIT is conducting one on one IT induction sessions for new staff. This is generally a 1-2 hour session that covers a basic introduction to our systems - file management, email, websites such as the intranet and portal, and telephone/voicemail - and concludes with a Training Needs Analysis.

Issue 1:
One of the things that I would like to explore is the use of an online Training Needs Analysis, instead of conducting it myself with the staff member during the session. Currently this is part of the one on one training that I carry out within the IT induction session, which involves questioning the learner, asking them to demonstrate skills, and then discussing what level of skill is required in their position.

Am undecided if this is the best way to go.

Advantages are flexibility of time (the session is sometimes limited for time, either on the staff member's part or on mine, and can be rushed), and more flexibility for staff, in that any staff member (existing or new) could carry out their own TNA at any time and in their own time.

Disadvantages could be that staff with low computer literacy may not be able to complete it easily (which would then give me a good idea of their skill level!).

My question would be - would the learner benefit from an online Training Needs Analysis? Sounds very simple - but obviously there's not much point if it's not going to be an improvement on the current method.



Issue 2:
Another aspect that I would like to look at is a more centralised and easily accessible method of offering follow-up support and resources.

Currently we have some resources on the Y: shared drive and also on MITnet (our intranet site), and I would like to make these more central and easily accessible.

Am considering putting all resources onto Blackboard (eMIT), with the long term goal of setting up online learning (but for the purposes of this evaluation will stick to the shorter term goals mentioned).

I think that the eLearning Guidelines that would fit with these issues are:

SD3
From Guidelines Wiki
Do students gain knowledge relevant to employment and/or current thinking in their field?
In this case, my students are the staff at MIT, and the relevance to employment is the job that they have just started at MIT.

Depending on their role, they have differing needs, eg a cleaner would not need the same level of IT literacy as say, an Administration Manager. It is sometimes difficult to ascertain, when speaking to a new staff member, exactly what training is required, for example, if they will be using Excel and if yes, do they need to know how to create formulas, or do they just need to know how to enter data on an existing worksheet. The job description does not give this detail, and as a new staff member they are sometimes not sure to what level they will be working. So, it is important that the teaching is at the level of the learner and related to their needs depending on their role.

ST1
From GuidelinesWiki
Do you have a way to identify student needs and respond to them?
New staff at MIT have a wide range of IT literacy levels from very low to very high. It would be useful to be able to get some idea of their needs prior to the session (although this wouldn't always be possible). This would give me the flexibility to be able to adapt the session more to the needs of the learner, and also to plan future training sessions.

Look forward to your comments/suggestions.

Tuesday, March 10, 2009

Week 2: Quality and Evaluation

Why is evaluation important to me, and how do I define it?
Evaluation is important because without it, we have no way of knowing whether what we are teaching is being learnt successfully and effectively. And if what we are trying to teach is not being learnt, then we are wasting our time and our students’ time.
To me, evaluation is knowing that:
• I am teaching the right subject matter, ie the content matches the learning outcomes;
• The teaching methodology that I am using is best suited to my learners and to the content that I am teaching;
• The course is interesting to the learners – fun, interactive, and catering for different learning styles;
• I am using effective teaching resources and activities;
• The assessments that I use work effectively alongside the course content, and match the learning outcomes.
My definition of evaluation is “to be able to identify important aspects of instructional design and their effectiveness”.

What sort of evaluations mentioned on the presentation are familiar to you already and why?
As a learner on this Graduate Certificate of eLearning, I have had experience with giving and receiving feedback via discussion forums, eg peer evaluation.
Many courses that I have attended as a learner have used questionnaires, either online or hard copy, to provide evaluations of the course.
Some types of evaluation that I have used when I have been evaluating my courses, have been:
• Training Needs Analysis – analysis of learners’ current skills and what further training they need in order to work more effectively. This enables me to gauge what courses / content I need to create and deliver.
• Observation – for new courses - I have run “test” courses for my peers for the purpose of getting constructive feedback on course content, resources and delivery.
• Questionnaires – written feedback after completion of a course to determine level of learner satisfaction.
• Self-evaluation – after delivering a course I review what went well and why, what didn't go so well and why, and what I can do better next time.

Why is quality important in eLearning?
Because of the isolation of an eLearner, it is imperative that good quality formative assessments or tasks are in place, so that the learner and the teacher are confident that they are on the right track.
In eLearning, there is no face to face teacher to give immediate feedback, explanation or clarification, so it is imperative that the course material and activities are of a high quality and easily located, and that instructions are clear and easily understood. Difficulty in accessing the course material, or unclear instructions, could lead to students becoming frustrated, unmotivated, and losing interest in the course.

It is also important to build in support for the eLearner – that would include building a sense of community with other learners, and easy access to the teacher and IT support.

Sunday, March 1, 2009

Hi everyone

This is my first posting - just a quick one to get things going.

I have worked fulltime at Manukau Institute of Technology for 8 1/2 years and my current position is as the staff software trainer.

I have done three of the GradCert in eLearning courses - the last one a couple of years ago. So I'm back on track now and keen to complete the rest.

In my role, I am looking at creating online software training for staff at MIT so this course will be invaluable, I am sure.

I do enjoy learning, and am also doing my National Certificate in Adult Literacy (Education) via distance learning, as well as studying part-time towards my Bachelor of Applied Communication.

It's great to see so many MIT staff on this course - looking forward to working with you and the others enrolled. Looks like we have a good group with experience in many different areas.

I'm off to do the Tongariro Crossing this weekend, so am hoping that the weather is better than last weekend!

Bye for now,
Michelle