CMALT: professional status for people who support learning using technology

December 1, 2015 at 11:51 am | Posted in Uncategorized, communication


I was delighted to read that Elizabeth Charles, Head of E-Services & Systems at Birkbeck, University of London, has become the 300th person to achieve CMALT and become a Certified Member of the Association for Learning Technology (ALT). I gained CMALT in 2009, soon after the scheme started, and have assessed one or two applicants a year since then. I am pleased to say that I was lead assessor for Elizabeth and thought that she submitted an exemplary portfolio that clearly evidenced her deep understanding of the Birkbeck learners and the ways in which she could use technology to support their learning. Well done!

A small group of ILIaD staff are also now in the process of developing their CMALT portfolio applications, so if you are interested in joining us or finding out more, please get in touch. You can read more about CMALT and the application process on the ALT website.

Informal learning spaces in student halls

November 30, 2015 at 2:59 pm | Posted in educational

Mayflower informal learning space

I’m involved in the University’s Common Learning Space (CLS) working group, helping to plan the institution’s teaching facilities, and have just completed an analysis of a survey of academics’ experiences teaching in these rooms. I hope to include some of these findings in a later blog post, but one immediate outcome was a meeting with David Podesta, Senior Estates Manager for our Residential Services.

He wanted to show me round a new informal learning space on the ground floor of the University’s new Mayflower Halls, a development of multi-storey flats in the centre of the city with 1100 study bedrooms. When I visited it on a rainy Monday morning there were only a couple of students working together, but it is really well used in the evenings and late into the night, providing students with a social alternative to lone study in their bedrooms. There are a variety of tables and chairs, a coffee machine and a handful of bookable group study spaces (see photo) with a screen to enable students to share the output from a laptop or tablet. We talked to the students and they really liked using the space, but asked for a printer to be available – so some valuable feedback from its users!

The aim is to develop similar spaces in all the halls of residence, perhaps repurposing the student bars which are struggling to remain profitable as students adopt a culture centred on cafes and clubs. The University has already invested in similar social learning spaces on campus, so the move to decentralise them is an interesting and welcome initiative.

TEAMMATES peer feedback system – review

November 13, 2015 at 2:52 pm | Posted in systems

Diagram showing TEAMMATES features

TEAMMATES is a free online system that facilitates anonymous peer feedback between students working in groups. It has been developed since 2010 by academics and students at the School of Computing and the Centre for the Development of Learning and Teaching at the National University of Singapore, and their intention is to keep it as a free service. My view is that it is a well-designed, mature system that offers an excellent user interface and experience. It runs on the Google App Engine, which provides strong stability and scalability, enabling it to cope with large cohorts and groups.

Tutors do need to have a Google account to use TEAMMATES, but students do not need one and can submit responses and view their feedback without ever needing to log in or sign up. However, if they do log in to TEAMMATES using their Google account, they can access all their TEAMMATES courses and feedback on one page and create a user profile.

Here is an overview of the process:

  1. Tutor logs in to TEAMMATES, creates a course and enrols the students/teams by simply copying the data from a spreadsheet (team, name, email, comments).
  2. Tutor creates a session and adds the questions they wish to ask. It is easy to copy and then edit an existing session.
  3. When the session opens (at a scheduled time) the students are emailed a unique link that they use to access their feedback form.
  4. The tutor can view the responses submitted at any time.
  5. Students are sent a reminder email 24 hours before the session closes (at its scheduled time). Extra reminders can be sent manually if required.
  6. After the session closes the tutor can review the results and then click publish to email the students with a unique link to view their individual feedback.
  7. The tutor can download the session results as a spreadsheet file. The results could be used to adjust individual student grades for group projects depending on their peers’ assessment of their contribution – one possible calculation is sketched below.
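
As a rough illustration of that last step, here is a minimal sketch (not a TEAMMATES feature) of one way the exported peer scores could be used to individualise a group grade. It assumes a simple WebPA-style weighting in which each student’s grade is scaled by their share of the team’s contribution scores; the names and numbers are purely hypothetical.

```python
# Minimal sketch (not part of TEAMMATES): scale a group grade by each student's
# share of the peer contribution scores. All values below are hypothetical.

def individualise_grades(group_grade, peer_scores):
    """peer_scores maps each student to the mean contribution score
    awarded by their teammates (e.g. on a 1-5 scale)."""
    team_mean = sum(peer_scores.values()) / len(peer_scores)
    return {
        student: round(group_grade * score / team_mean, 1)
        for student, score in peer_scores.items()
    }

# Example: a team mark of 65% and mean peer scores taken from the downloaded results.
print(individualise_grades(65, {"Asha": 4.5, "Ben": 3.0, "Carla": 4.0}))
# {'Asha': 76.3, 'Ben': 50.9, 'Carla': 67.8}
```

In practice a department would probably cap or dampen the adjustment, but the principle is the same.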

A real strength of TEAMMATES is the range of question types available and the flexibility of the feedback paths and visibility that can be easily assigned to them. For example, the question below asks students to provide feedback to the other members of their team about their ‘contribution to team meetings’ by choosing an option. The visibility has been set so that the feedback is anonymous, and only visible to the recipient (of the feedback).

TEAMMATES question creation 1

However, note that many other visibility options are available, so if the question asked for comments about ‘strengths that the student brought to the team’ then that feedback could be shared with the rest of their team and the givers identified. This opens up many interesting possibilities for generating constructive formative feedback for developing effective teams and team skills.

The rubric question format makes it easy for students to provide feedback on a range of issues using a compact format. The rubric can also be shared at the start of the project so that the students have clear guidance on the behaviours that are needed to get good marks.

TEAMMATES question creation 2

The overall ease of use of the system is also a major plus, as it encourages multiple formative feedback activities during a group project/assignment. For example, near the start of the project a simple session (form) could provide team members with early feedback on whether their performance requires improvement. Later on, another session could be used to help the team keep on track, and a final session could assess effort, contribution and teamwork, with the scores used summatively to individualise grades.

ParticiPoll – review

November 6, 2015 at 12:46 pm | Posted in student response systems

ParticiPoll is an online service that enables simple polls to be added to PowerPoint presentations. At the moment it only works with Windows and PowerPoint 2010 or later – see their how-to guide for an overview of its use.

The system enables presenters to add a multiple-choice poll (6 choices max) to any slide that lists those choices using A-F bullets. Voters use any web browser to navigate to the presenter’s unique URL to make their choice. The add-in makes it easy to add a large QR code to a slide to simplify access. The voting screen always shows six possible choices and does not refresh after each vote – I liked this and thought it made it really easy to use: just click to vote when a new poll is shown. The free version shows adverts, but they did not seem to be intrusive.

ParticiPoll voting interface

The presenter can see the number of votes cast and the resulting bar graph is hidden until they advance the presentation. The vertical bars show the % and number of votes for each option.

In the free ad-supported version, there is no limit on the number of voters or the number of polls per presentation. Pro licences are available for $10 per month or $100 per year, and enable customisation, private polls, and download of poll data. Crucially, they also enable live audience comments which are shown on a separate web page.

Overall, I thought this was a really easy-to-use system, and the availability of a low-cost monthly licence with audience comments makes it ideal for occasional events.


It requires an add-in to be downloaded and installed, so if academics wish to use it in teaching rooms they will need to install it on a laptop and use that to present. Alternatively, ParticiPoll also provide a macro-enabled file that needs to be run before you open your presentation – I tried this and it seemed easy enough to do at the start of each lecture.

EMA: the Electronic Management of Assessment

November 2, 2015 at 11:23 am | Posted in Uncategorized
Diagram showing assessment and feedback lifecycle

The assessment and feedback life cycle (adapted from an original by Manchester Metropolitan University) CC-NC-BY-SA

One of the great learning technology success stories of the past few years has been the rapid growth of e-submission where students submit their assignments online instead of in person at the Faculty office. This enjoys almost universal support from students due to its convenience, with no need to print the essay or queue to hand it in by the deadline. Unfortunately, that is also the point at which all kinds of challenges begin:

  • are those essays printed and then distributed to the markers by administrative staff?
  • or are markers willing to change their working practices and mark on-screen?
  • how are the grades and feedback returned to the student?
  • how does the system support penalties for late submission or extensions with mitigating circumstances?
  • does the system support policies such as anonymous and double-blind marking?
  • does the system support moderation and consequent changes to grades?

And within these larger challenges are important QA details, such as whether an audit trail is retained showing both markers’ original grades and comments as well as the moderated grade and comments.
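
To make the audit trail point concrete, here is a minimal sketch (not based on any particular system) of the kind of record an EMA workflow would need to retain so that both markers’ original grades and comments survive moderation. All field names and values are illustrative.

```python
# Illustrative sketch only: an audit record that keeps both markers' original
# grades and comments alongside the moderated outcome.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class MarkRecord:
    marker_id: str          # anonymised, supporting double-blind marking
    grade: int
    comments: str

@dataclass
class AssignmentAudit:
    submission_id: str      # anonymised submission ID, not the student's name
    markings: List[MarkRecord] = field(default_factory=list)
    moderated_grade: Optional[int] = None
    moderation_notes: str = ""

audit = AssignmentAudit("sub-0042")
audit.markings.append(MarkRecord("marker-A", 62, "Sound argument, weak referencing"))
audit.markings.append(MarkRecord("marker-B", 68, "Good range of sources"))
audit.moderated_grade = 65   # both original marks remain on record for QA
```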

There are currently four systems in use at Southampton: the Blackboard Assignment tool, the Turnitin originality-checking system, our JISC-funded institutional e-Assignment tool, and the handin tool developed and used only by Electronics and Computer Science. None of them provides a complete solution and there is no institutional policy mandating their use for all (suitable) assignments.

Southampton is one of the partners on a JISC project about the Electronic Management of Assessment (EMA). This aims to support institutions who wish to develop their policies, processes and systems, and share case studies and examples of good practice. It recently published an online guide, Transforming assessment and feedback with technology, and is developing a toolkit that institutions, faculties and departments can use to benchmark their current use of EMA and plan their next steps. ILIaD will be working with colleagues in faculties, iSolutions and Student and Academic Administration to help the University take a step change forward and ensure that everyone (students, academics and administrators) gains real benefits from online submission, marking and feedback.

Lecture capture for teachers

June 16, 2015 at 10:20 am | Posted in Uncategorized

The University of Southampton has invested in Panopto, an institutional lecture capture system that is hosted on a cluster of high-performance servers and integrated with our media servers. This level of investment is clearly unrealistic for schools and colleges, but there are all kinds of low-cost alternatives.

Lecture capture services
If a college intends to make widespread and frequent use of lecture capture, then systems like Panopto are available as hosted services – so there is no need to buy or manage any servers. A key issue to consider will be the upload bandwidth of the college’s internet connection, which must have the capacity to handle the video data being sent to the service. On the other hand, when learners access those lectures from outside the college, this will not use any of the college’s bandwidth.

Video hosting
The issue of where the recorded lectures are hosted is a key consideration. Although they could be hosted on the college’s web servers, this is not ideal as those servers are not designed to deliver lots of streaming video content, and the college’s internet connection will also probably be a bottleneck. It makes much more sense to host the videos using a commercial service such as YouTube or Vimeo, and many of the alternatives below can publish videos directly to YouTube. The videos can be published as ‘private’, so that only people who know the URL can access them, but teachers should always be aware that social media means that any video is potentially public!

If a computer presentation (PowerPoint, Keynote, Prezi etc.) is central to the lecture, then a simple option is to record the PC screen and the teacher’s voice. The software needs to be installed on the PC, and this often means using your laptop rather than the PC installed in the teaching room. Costs range from zero for the free open-source CamStudio to around £120 for an educational licence for Camtasia Studio. Alternatively, there are low-cost online screen recorders such as Screencast-O-Matic that can be used on any computer. Google ‘screencam software’ to find similar software and services.

Good audio quality is essential, and the microphones built into laptops may not be adequate. A USB boundary mic will allow you to move around the classroom but will also pick up audience and environment noise. Wired tie-clip mics on long leads (4m) are cheap and work reasonably well. In your office or at home, a USB headset (£25) works really well.

Screencam software will also enable you to record the image from a webcam – as an alternative to the screen, as a picture-in-picture, or switching between the screen/webcam in the editor, depending on the capabilities of the software. You will of course have to position the webcam and ensure the lighting is good. Some webcams have really good microphones and can be used instead of a boundary mic, even if you choose not to record the video.

Flipped learning
It is probably more educationally effective for you to record short (up to 10-minute) videos that target specific learning outcomes and use these as self-study resources to support your face-to-face teaching activities. These recordings can be made at your desk, or more likely at home where it is quieter and you won’t be interrupted. You may want to script what you say to ensure you can record in one take (after a couple of practice run-throughs) and minimise any editing required. [More on flipped learning]

Both iPads and Android tablets have a wealth of apps that can be used to make educational videos, such as Explain Everything. [More alternatives]. These are best suited to recording videos for flipped learning rather than capturing live lectures. Alternatively, you can attach the tablet to a tripod and use it like a video camera if all you need to record is the teacher rather than a presentation.

Poll Everywhere – review

February 6, 2015 at 10:23 am | Posted in student response systems

Poll Everywhere is a commercial web service that enables live audience voting. There is a ‘try it free’ option for educators which allows up to 40 responses per poll, an annual ‘per-instructor’ option at $349 (up to 400 responses per poll) and an annual institutional site licence for around $3 per student (1000+ responses per poll). The terms and conditions are really clearly explained in plain English – a welcome touch and done with humour! A key strength of this service is the range of options available to voters who can respond via the web, via SMS txt message or via Twitter.


Poll Everywhere tutor’s screen showing timer and presentation controls.

Creating a Poll

The tutor logs in to their Poll Everywhere account, clicks the Create Poll button and types the question stem. They can choose whether the audience response is open-ended (i.e. free text response), multiple choice (one answer) or clickable image.

The next step is to configure the poll:

  • how people can respond (website/SMS txt message/Twitter);
  • how many times each person can respond (default is once) and whether their response is anonymous (this option applies where registered  participants are responding);
  • auto start-times and stop-times if required, so polls can be scheduled.

The tutor should then test the poll by making it active and voting using the on-screen simulated ‘mobile phone display’ and/or real phones/tablets/etc. to check that the question works as intended. The test results are cleared and the question can then be presented, in fullscreen mode if required.

Multiple-choice questions: the question stem can only be text; there is no image option. The answer options can be text, an image URL or an uploaded image file. The images are resized, but some care still needs to be taken to ensure they work at the small size displayed in the poll. The text can be maths equations expressed using LaTeX; just use the prefix “latex:”, e.g. latex: V = \frac{4}{3}\pi r^3

Update: the Visual Settings for the question allow you to upload a logo (JPG, PNG, GIF) that appears above the question stem – and of course this could be part of the question (“what does this photo show” or “name the feature labelled A”). The image cannot be too large, as the poll options also have to fit on the screen and are resized so that they do… so big image = tiny text. So this is a work-around that may meet some needs, but questions about detailed images will need to appear on a separate screen. For example, the tutor could display the image (in PowerPoint? SlideShare?), switch to the browser to show the question and options in Poll Everywhere, then switch back to the image while the students ponder which answer to choose on their devices.

Open-ended questions: display as text wall, word cloud, cluster or ticker. Default is ‘respond as many times as they like’. A profanity filter is available to censor or block responses that include profanity – the default is ‘anything goes!’ – but it is easy to defeat the filter by using accented vowels – Oh cräp! Moderating responses before they appear on-screen is only possible for paid accounts, but can be done by the tutor or an assistant on a separate mobile device to avoid any risk of inappropriate messages being shown to an audience. The word cloud treats every word in a response as separate (e.g. “Chromium Dioxide” appears as “Chromium” and “Dioxide”) but filters out common words like “the”.
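
The splitting behaviour described above amounts to breaking each response into single words and discarding a short stop-word list. A minimal sketch of that logic (illustrative only, not Poll Everywhere’s actual implementation):

```python
# Why multi-word answers split in the word cloud: responses are tokenised into
# single words and common 'stop words' are dropped. Illustrative logic only.
from collections import Counter

responses = ["Chromium Dioxide", "the chromium layer", "dioxide"]
stop_words = {"the", "a", "an", "of", "and"}

words = [w.lower() for r in responses for w in r.split() if w.lower() not in stop_words]
print(Counter(words))   # Counter({'chromium': 2, 'dioxide': 2, 'layer': 1})
```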

Clickable image questions: the tutor uploads a JPG or PNG image which is automatically resized. To track the number of clicks on selected areas, multiple rectangular areas can be defined. These have a minimum size so do not allow precise selections, and of course the areas cannot overlap. The areas (and their number of votes) can be shown/hidden on the tutor’s screen during the poll, but are never visible on the students’ screens. The tutor can also choose to display the precise location of each click.

Groups of polls: multiple polls can be selected and added to named Groups. Polls can be added and re-ordered using drag-and-drop. The tutor can then step through the polls in that group in order, activating each poll to make it visible and starting/stopping voting as required.

Control during polling

The default is to show the polling results live as the responses come in, so the chart updates in real time. However, a control makes it easy for the tutor to hide the multiple-choice chart and just show the question and its answer options during the vote.

A countdown timer is available on the polling screen – the tutor just types in the number of seconds and clicks to start the timer. Students will not be able to vote after it has reached zero, but it can be paused. The tutor can also manually stop the poll, restart it and clear the results if required.


Poll Everywhere tutor’s display showing answer options instead of live voting chart.

Remote voting on polls

The tutor’s presentation screen for a poll has a Share control panel with three options:

  1. share via a web page which displays the poll currently activated. This would enable students at remote locations (possibly watching a streamed lecture) to vote online and see exactly what the on-campus students would see.
  2. share via a web page which only displays that specific poll, but allows people to answer that question at any time (for example like a mini-survey). They cannot see the results.
  3. share via a web page which only displays the live results for that specific poll. So if people have voted via method 2 above, this link would allow them to review the results.

Options 2 and 3 have buttons that make it easy to share those links via email, Facebook or Twitter.

Integration with PowerPoint and Keynote

The tutor needs to download the free PollEv Presenter App, available for Windows and Mac running Office 2007 or newer, or Keynote 5.3 or 6.5. While this initially means that tutors would need to use their own laptops to present, if an institutional licence were bought there is an Enterprise Deployment option available so that all centrally-configured PCs in offices and lecture rooms could have the app by default.

The PollEv Presenter App adds a Poll Everywhere ribbon to PowerPoint that makes it easy to insert a poll. The tutor clicks a button on the ribbon to log in to their Poll Everywhere account and then chooses the poll(s) that they wish to insert among their conventional slides.

Note that the only way to tell which poll question you have inserted at a particular place in your presentation is to look at the notes for that slide – the placeholders all look the same.

Bonus feature: the PollEv Presenter App also makes it possible to insert any web page into a PowerPoint presentation. Just activate that option from the ribbon’s About button.

Bonus feature: if you use PowerPoint for Windows, the free Presentation Remote app for iOS or Android mobile devices enables you to remotely control presentations.

Visual design of polls

The system offers a great deal of control over the visual appearance of slides: colour schemes, fonts, background images, bars/columns, axis labels, response counts/percentages etc. The Settings menu allows you to use any poll as a template for new polls, or to apply that poll’s visual settings to all your polls.


Reports

Reports can be created for individual polls or groups of polls. List reports show the response to each poll by each participant, so that voting patterns can be explored. Summary reports show the results for each poll – i.e. the number and percentage of votes for each option. There are also Survey reports, Grading reports (if questions have scores), Team reports (if segmentation is used) and a Sign-In Sheet (time of first and last selected poll). The data can be downloaded for further analysis if desired.
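
Because the data can be downloaded, further analysis is straightforward. Here is a minimal sketch using pandas, assuming a hypothetical CSV export with ‘participant’ and ‘response’ columns for a single poll; the real export format may differ.

```python
# Sketch of offline analysis of a downloaded report. The file name and column
# names ('participant', 'response') are assumptions, not Poll Everywhere's format.
import pandas as pd

results = pd.read_csv("poll_results.csv")
counts = results["response"].value_counts()        # votes per option
print(counts)
print((counts / counts.sum() * 100).round(1))      # percentage per option
```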

Premium features not available in the free trial account

The key feature is the ability to restrict access to a poll to registered participants. Tutors can send an email invite which requires recipients to create an account (email address and password). Alternatively it is possible to integrate with Blackboard (and Canvas), so that only students in a specific module can access a group of polls.

Once you know who is voting and how they are voting, the next premium feature is, of course, grading responses and ranking participants. Tutors gain the ability to mark a multiple-choice response or clickable area as correct, show/hide the correct answer on screen, track participants and rank them according to their overall score on a group of polls. Naturally there are reports that can be viewed or downloaded that detail student performance.

Some licence plans allow multiple users to share the same account – so members of a teaching team could easily share the creation and delivery of polls.

As previously mentioned, premium accounts are also able to moderate free-text responses before they are on-screen and hide any that don’t meet the tutor’s academic standards ;-)

Other interesting features

  • Send people a link to a group of polls which form a single-page online survey.
  • Segmentation, which enables you to correlate the results from a poll with a previous poll. For example, you might ask people whether they are male or female and then compare how each group answered a poll about their drinking habits (a rough sketch of this kind of cross-tabulation follows this list). It can also be used to enable team competitions.
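
The segmentation example above is essentially a cross-tabulation of two polls matched by participant. A rough sketch in pandas, using hypothetical column names and data rather than the actual export format:

```python
# Sketch of segmentation as a cross-tabulation of two polls' responses, matched
# by participant. Column names and data are hypothetical.
import pandas as pd

responses = pd.DataFrame({
    "participant": ["p1", "p2", "p3", "p4"],
    "gender":      ["female", "male", "female", "male"],
    "drinks":      ["rarely", "weekly", "weekly", "rarely"],
})
print(pd.crosstab(responses["gender"], responses["drinks"]))
```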


Overall, I’m really impressed with Poll Everywhere. It has an attractive user interface that is fairly easy to use – although there are also a lot of powerful features a mouse-click away for those who want them. The range of voting options is impressive, and the visual appearance of polls on mobile devices is great – without any need to download a special app either. The integration with PowerPoint and Keynote is reasonably good, although tutors will need to be careful about which poll appears where. My next step is to run some tests with a cohort of students to see the reporting in more detail and then ideally to use a paid account to see how registration works.

Compared with Turning Point, Poll Everywhere does not seem to support multiple-response questions, and it wasn’t clear whether a correct answer to a poll is always worth one mark or whether the score for each poll can be set. So Turning Point offers more sophisticated multiple-choice question types and scoring, but Poll Everywhere provides live feedback, which is especially useful for the open-ended questions – and the clickable image feature has great educational potential.

I’ll add a follow-up post once I’ve explored its reporting features in more detail.

Using mobile devices to record one-to-one meetings

January 23, 2015 at 1:05 pm | Posted in hands-on

I was asked recently to provide some advice to some students who need to video-record dyslexia assessment and tutorial sessions in order to provide evidence for their professional practice qualifications. However, the following advice applies to recording any one-to-one session. The objective is for them to use the devices they own (laptops, tablets or smartphones) to create digital video files which can be securely shared with their tutor.

Basics: make sure that the room is well illuminated and that the camera is positioned to get a good view of the participant’s face, hands and any materials that you are using for the assessment.  Try to make sure that the participant is not back-lit against a window as this will typically make them difficult to see clearly. You (the assessor) will also need to be in shot.

Audio: recording good-quality audio is essential, and although the microphone built into the device you are using may be adequate, you may need to use an external microphone. This is one of the key reasons why you need to practise making a recording well in advance of the real thing.

Laptops: many laptops have webcams built-in, but positioning the laptop in the room may be difficult. You may need to buy a low-cost USB webcam that can be fixed to a tripod so that it is at the right height and angle.

Tablets/smartphones: remember to make sure they are on their sides (landscape mode) as video shot in portrait will appear sideways on your tutor’s PC screen.

Tripods: you can use a camera tripod or bodge something using elastic bands, masking tape and a music stand. For example very few USB webcams have a tripod screw-mount, so you’ll have to tape it to a tripod/stand. You can buy special cases or grips that make it easy to attach a tablet or smartphone to a tripod/stand – it is worth investing in one of these if you need to make more than one or two videos.

File format: ideally MP4, a highly compressed format that can be readily viewed using free software. You should also record at a medium resolution (such as 720p) as the extra visual detail provided by high resolution files (1080p) is not necessary and just leads to much larger files to upload/download.
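
To see why the higher resolution mainly costs you upload and download time, here is a rough back-of-envelope calculation using assumed (not measured) bitrates:

```python
# Rough sketch with assumed bitrates (real encoders vary): estimated file size
# for a one-hour recording at 720p versus 1080p.
def file_size_gb(bitrate_mbps, minutes=60):
    return bitrate_mbps * 1e6 / 8 * minutes * 60 / 1e9   # Mbit/s -> gigabytes

for label, mbps in [("720p at ~2.5 Mbit/s", 2.5), ("1080p at ~5 Mbit/s", 5.0)]:
    print(f"{label}: about {file_size_gb(mbps):.1f} GB per hour")
# 720p at ~2.5 Mbit/s: about 1.1 GB per hour
# 1080p at ~5 Mbit/s: about 2.2 GB per hour
```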

Finally, it is essential that you make at least a couple of short (5 minute+) practice recordings well in advance of the session you wish to record so that you are confident with the equipment and its setup, the recording options (audio and video quality/filetype) and any post-production needed. For example, how do you export the video from your phone/tablet in the correct format to a PC so you can upload it securely to the person assessing the session? You may need to search the web for advice on how to do that for your particular device/software. Or just ask a teenager ;-)

Note that although it is probably easy to upload video directly from a phone/tablet to YouTube as an unlisted video (so you could simply send your tutor the link), you must not do this as it is insufficiently secure to meet professional requirements for confidentiality. At Southampton we have our internal Dropoff service, which does meet those requirements.

Zappers zapped.

December 4, 2014 at 3:47 pm | Posted in student response systems

SRS handset being struck by lightning

After seven years of use, I’ve decided it is time to zap the zapper – no one else uses that term and it really doesn’t describe what they do at all. So all hail the clicker.

Now to update a pile of documentation and web pages…

Stumbling into some pitfalls with ResponseWare

December 4, 2014 at 3:08 pm | Posted in student response systems

I’ve just run a session introducing the online ResponseWare Student Response System to some academic colleagues, and have identified some pitfalls… by stumbling straight into them. I should start by saying that I think online SRS are the way to go, especially now most students have a smartphone, and all of our lecture spaces have excellent Wi-Fi following an ambitious upgrade project. ResponseWare has the advantage that it integrates perfectly with the Turning Point clickers that are already in widespread use across the university, and that it therefore minimises the learning curve for tutors and does not require them to recreate their resources and quizzes for a new system.

So, the gotchas were:

  • If possible, iOS and (especially) Android users should install and use the ResponseWare app rather than using web-browser access; it gave a reliable and superior user experience. Of course the app needs to be up to date (v2) and will need updating again before the end of the year following another upgrade to the ResponseWare service.
  • If not, iOS and (especially) Android users should use an up-to-date version of Chrome rather than the default browser. I had one user with an iPad v1 (iOS 5) and another with a Samsung tablet using the default Android browser (Internet) – neither  of which worked.

There was also an embarrassing gotcha in my presentation:

  • The response grids for my short-answer questions used an unreadably pale grey from the slideshow’s colour scheme; I should have tested the presentation before delivering it. Mea maxima culpa…

One of the attractions of ResponseWare is that students without suitable mobile devices (or whose batteries have run out) can be given a clicker so they can still take part in the voting. The tutor just needs to bring a small number of clickers (enough for 10% of the cohort, perhaps) as well as a USB receiver. This will work fine for multiple-choice questions, but not for short-answer questions – and this may become an issue as tutors start to take advantage of the short-answer questions enabled by ResponseWare.

One of the participants asked whether requiring students to use their phones/tablets/laptops in sessions will simply encourage them to become distracted by Facebook, Twitter, Snapchat etc. My answer was that there are plenty of legitimate uses for such devices (such as note-taking or looking up references) and that we need students to develop the self-control to pay attention to their own learning as well as our teaching – especially if it has been made more engaging through the use of SRS and the pedagogic techniques they facilitate.

Nevertheless, this seems like a good place to repeat that link to a post by Clay Shirky which includes strong evidence against the use of laptops, tablets and phones in class, except when specifically requested by the tutor. He argues that they are a constant (and highly effective) form of distraction, and that multi-tasking interferes with learning: “Multi-taskers often think they are like gym rats, bulking up their ability to juggle tasks, when in fact they are like alcoholics, degrading their abilities through over-consumption.”

Perhaps academics need to devise educational approaches that require students to make effective use of their mobile devices, which scaffold and help model good practice while discouraging off-topic uses. I suspect that social (collaborative) learning will be at the heart of this since it is the lone (isolated) student who has the greatest motivation to get distracted by communicating with friends or browsing around something that seems more interesting than a didactic lecture.
