EdTech Needs More Accountability

Over the past few weeks the effectiveness of educational technology has been thrown into serious question. Tablets in the classroom have been described as a behaviour management nightmare, and doubts have been raised about the educational impact of large technology purchases. These issues around edtech don’t seem to be going away any time soon!

As much as we might rush to defend technology, we also need to step back and accept that these judgements and questions have validity. Technology can undoubtedly be used to great effect in the classroom (and beyond) by a good teacher deploying it to achieve specific goals. Yet if you ask around you’ll hear of enough white elephant purchases, bad technology management practices and even lessons where students are left to just search on Google to make your skin crawl!

Go out and network with schools and ask questions about their use of tech. It won’t be long before you find bulk tablet purchases made without an MDM system to push out apps, VLEs bought without any planning, training or coordination, hundreds of laptops sitting on a barely functional domain, and tech shoved into classrooms with no advice on structured or effective use. The list could go on…

So, should it really surprise us that these purchases aren’t always having an impact? No! Not at all! The question is how do we move on to ensure that technology almost always has an impact?

This all comes down to accountability. Tech purchases should be held accountable for impact in the same way as lessons, interventions or any other educational initiative in a school. Tech should be well planned, properly supported and held to account for its impact. If a tool or method isn’t being used well, or can’t be linked to improved outcomes, it’s important to stop wasting time and money on it! The model I like to use is a little acronym called OATSS. It’s not rocket science, but by sticking to it we’ve gained much better oversight of the educational technology we deploy, and that technology is held accountable for being effective.


1. Objectives


This takes an institutional mentality shift, but it is well worth it. Before any e-learning platform or product is purchased, you should outline the key objectives that you want it to achieve. These should be specific about usage, related to student attainment and linked to the outcomes you wish to see (e.g. student independence, students accessing their work remotely off-site, improved feedback, better engagement with reading). These objectives become very useful when you come back to evaluate the product’s success after a year or so.

This approach also avoids snobbishly pooh-poohing the new products and services that people see at conferences and get passionate about. They might turn out to be good investments! Likewise, when you sit down and plan what you want the product or service to achieve, you might just realise that there are better ways to meet those objectives.

2. Assessment


Before rolling out a service or product to large groups of students it is important to assess it in controlled, real-world conditions. You need to be sure that everything works before a full rollout causes huge levels of disruption. If you can, ask the company for a free trial period. Companies may be hesitant, but if you explain that this is just for testing, and that you won’t buy an e-learning package without proper testing, most good companies will support you. If they don’t want you to try before you buy, you really should be asking why! It might be that trials frequently put customers off, and that should speak volumes about the product!

The assessment period is designed to iron out problems with product implementation in a controlled environment before you roll out to large groups of students. You should also carry this out in “real world” conditions. For example, if thirty students are meant to be using an app at once, in the same classroom, on iPads, you should see exactly what this looks like (and what goes wrong). A rough load-test sketch follows the checklist below.

Consider the following during the assessment period:

1) Does the package operate properly on your dominant devices? If not, can you make alternative devices available at the times that it will be used? Can the company provide an alternative version of the package that works on your dominant devices?

2) Does the product or service work properly through the firewall? What changes need to be made for it to do so?

3) Is the tool very bandwidth intensive? If so, can it be feasibly operated over WiFi? Do you need to use the tool in an environment where all devices are connected by wired ethernet? Can you free up enough computer rooms to enable this?

4) Do any other hardware, software or group policy changes need to be made to get the platform bug-free? For example, before we moved to FrogLearn in 2013 we only needed Internet Explorer 9 on a Windows domain environment – YouTube was about as adventurous as we got. After that point we needed to ensure that all machines had Chrome as the default browser to give the best user experience, with Internet Explorer 11 on hand as a backup.

5) What are the “quirks” of the platform? Every platform has them (even the best); they are never written down anywhere, but they can cause real confusion for staff. How will you teach staff and students to cope with them? How will you feed this into a training programme or produce support material for staff and students?
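One cheap way to approximate the “thirty students at once” scenario is a quick scripted load test. The sketch below is a minimal Python example, assuming a placeholder platform URL; it only fetches a single page per simulated student, so treat it as a smoke test for points 2 and 3 above rather than a substitute for putting real devices in a real classroom.

```python
# A rough load-test sketch: simulate thirty students hitting the
# platform at once, as they would in a single classroom.
# PLATFORM_URL is a placeholder; point it at your trial instance.
import time
from concurrent.futures import ThreadPoolExecutor

import requests

PLATFORM_URL = "https://vle.example-school.org/login"  # placeholder
STUDENTS = 30

def simulate_student(student_id: int) -> float:
    """Fetch the login page and return the response time in seconds."""
    start = time.monotonic()
    response = requests.get(PLATFORM_URL, timeout=30)
    response.raise_for_status()
    return time.monotonic() - start

with ThreadPoolExecutor(max_workers=STUDENTS) as pool:
    timings = list(pool.map(simulate_student, range(STUDENTS)))

print(f"slowest response: {max(timings):.2f}s")
print(f"average response: {sum(timings) / len(timings):.2f}s")
```

If the slowest response balloons when thirty requests land together, that points straight back at the bandwidth and firewall questions above.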

3. Tactical Planning


After all of the problems from the assessment phase have been resolved, it’s time to begin working on a strategy to get students and teachers using the platform. This should be clear and well worked out. For example, when launching FrogLearn the strategy we adopted was to give teachers the opportunity to contribute and lead on specific projects, while also trying to reach students as directly as possible. That’s why we launched a fully resourced, student-led revision centre. This way we could develop and grow the platform, students would see it as valuable, and keen staff had a focused reason to get involved with Frog. It was a very successful launch strategy which saw some teachers become increasingly confident about using the platform in other ways with students.

This will be explored more in an upcoming post, but how will you make it easy for students to access the platform? Can you allow single sign-on using a really convenient login method such as Active Directory, Google Apps or an existing VLE? This will increase ease of access and eliminate the annoying issue of students forgetting passwords.
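To make this concrete, here is a minimal Python sketch of the simplest version of “log in with your existing school account”: validating a student’s Active Directory credentials with an LDAP bind via the ldap3 library. The server address and domain are placeholder assumptions for your own environment, and a full single sign-on setup would more likely use SAML or OAuth so that the platform never handles passwords directly.

```python
# Minimal sketch: check a username/password pair against Active
# Directory with a simple LDAP bind. Server and domain are placeholders.
from ldap3 import ALL, Connection, Server
from ldap3.core.exceptions import LDAPBindError

AD_SERVER = "ldaps://dc01.example-school.internal"  # placeholder
AD_DOMAIN = "SCHOOL"                                # placeholder NetBIOS name

def check_ad_credentials(username: str, password: str) -> bool:
    """Return True only if the credentials bind successfully."""
    server = Server(AD_SERVER, get_info=ALL)
    try:
        # auto_bind raises LDAPBindError if the credentials are rejected.
        with Connection(server, user=f"{AD_DOMAIN}\\{username}",
                        password=password, auto_bind=True):
            return True
    except LDAPBindError:
        return False

print(check_ad_credentials("jsmith", "not-a-real-password"))
```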

4. Support


How will the platform be updated and maintained? Who is responsible for this? Do new staff and students need to be manually imported as users? If so, is there a member of admin staff who can do this reliably? Can the tool read and import users from your MIS so that staff time isn’t spent creating accounts by hand? If students need logins, who will issue them when they start? If the platform needs any support, who is the named person with allocated time to work with the company when something doesn’t work correctly? If staff need support or training with the platform, who do they go to?

You have to be able to answer these questions. If you cannot, the platform will fail.

Role allocation is key here. If a teacher is the frontline support, their time commitments are so intense that regular maintenance is likely to be patchy. If it is a member of IT or support staff, they need to be well informed about how the product is meant to function so that they can offer the right level of technical and training support.
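To give a flavour of what “reading users from your MIS” might look like, here is a hedged Python sketch that takes a CSV export from an MIS and creates accounts through a platform API. The endpoint, token and column names are entirely hypothetical: every MIS and e-learning platform exposes its own interface, if it exposes one at all.

```python
# Hypothetical sketch: bulk-create platform accounts from an MIS CSV
# export so that admin staff aren't typing in new users by hand.
# The API endpoint, token and CSV columns are all placeholders.
import csv

import requests

PLATFORM_API = "https://vle.example-school.org/api/users"  # hypothetical
API_TOKEN = "replace-with-a-real-token"                    # hypothetical

def import_users(mis_export_path: str) -> None:
    """POST one account-creation request per row of the MIS export."""
    with open(mis_export_path, newline="") as f:
        for row in csv.DictReader(f):
            response = requests.post(
                PLATFORM_API,
                headers={"Authorization": f"Bearer {API_TOKEN}"},
                json={
                    "username": row["username"],
                    "full_name": row["full_name"],
                    "year_group": row["year_group"],
                },
                timeout=30,
            )
            response.raise_for_status()
            print(f"created account for {row['username']}")

import_users("mis_export.csv")
```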

5. Success


This last part is critical, and it is often the step that tech implementation strategies forget. How will you assess whether the platform, product or service had any impact? There are a few approaches to choose from:

  1. Examine which faculties, departments or classes used the product the most, then examine whether they made more progress in comparison with their peers. You might already see a problem with this: keen faculties and teachers who achieve good outcomes with students may be the very staff who are most willing to experiment with new technology, so it is difficult to distinguish correlation from causation. Is the product actually having an impact, or does a keen member of staff just happen to use it more? However, conversations with members of staff will allow you to triangulate your data and determine whether the product or service contributed to the success of students. Teachers and students can pretty reliably tell you if something is working or not.
  2. Go back to your objectives. Were they met, or are they close to being met with more input? If not, then it is time to question the future of the platform at your institution. There’s no point hanging on and using packages that are not working for you.
  3. Correlate student achievement against usage. A nice method for this is to examine login and usage statistics for individual students and to check whether those students performed better than their peers who did not use the platform. Earlier this year we carried out an impact analysis of the Year 11 revision centre and determined that students from all sets who used Frog more regularly to revise made much more progress than their peers. A future post will examine this in more detail. If an e-learning platform doesn’t provide you with login analytics, consider putting a Google Form before the login page to capture usernames and login times. We have developed PHP scripts that do this silently in the background, without users realising that Frog is forwarding their username as a query string into a self-submitting Google Form; a rough sketch of the same idea follows this list.
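For anyone curious about the Google Form trick in point 3: our production version is a PHP script, but the rough Python sketch below shows the same idea, logging a username and timestamp by POSTing straight to the form’s formResponse endpoint. The form ID and entry field IDs are placeholders; you can find your own by inspecting a pre-filled link for your form.

```python
# Rough sketch: silently log a login event to a Google Form's response
# sheet. The form ID and entry.* field IDs below are placeholders.
from datetime import datetime

import requests

FORM_URL = ("https://docs.google.com/forms/d/e/"
            "FORM_ID_PLACEHOLDER/formResponse")

def log_login(username: str) -> None:
    """Record one username/timestamp pair as a form response."""
    requests.post(
        FORM_URL,
        data={
            "entry.1111111111": username,                    # placeholder ID
            "entry.2222222222": datetime.now().isoformat(),  # placeholder ID
        },
        timeout=10,
    )

log_login("jsmith")
```

The responses then land in a spreadsheet that you can cross-reference against attainment data.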

Hopefully this model outlines a route to more accountability and better management of educational technology. Please drop a comment below if you would like to share your methods and procedures for implementing educational technology at your school and tracking its effectiveness.
