Over the past 20 years, the visual and audible reach of the classroom has expanded dramatically: it is now hard to imagine teaching geography solely through maps and books, or languages from a teacher’s cassette tape. Documentaries, virtual environments and high-resolution audio recordings give students a far more authentic experience of topics such as volcanoes, gravity and the workings of a heart than any of their predecessors had.
These technologies are an undoubted boon where learning can be effectively transmitted via eye and ear. But what happens when the subject matter at hand relies on a sense of touch, or of feeling something, in order for it to make sense? That is where haptics comes in.
What is haptics?
Haptics is an emerging technology that seeks to close the gap between what one can see and what one can touch in the virtual world, bringing to life concepts such as the periodic table, dissection, physics and product design.
Here are four new ways of teaching those topics:
- Teach the periodic table by asking students to pull virtual elements apart and arrange them according to how much resistance they encounter.
- Support students doing A Level Biology by giving them unlimited and free opportunities to dissect an eyeball, cutting through objects of different density and resistance until they are completely familiar with the process.
- Promote understanding of sound waves and personal health by creating a digital ear-drum and using a haptic controller to feel how it responds to sound waves.
- Enable product designers to engage emotionally and physically with objects by using CAD software that can assign properties of density to their objects.
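To give a feel for the physics behind the ear-drum idea above, here is a minimal, illustrative sketch (not the project’s actual app) that models the ear drum as a damped, driven harmonic oscillator. The natural frequency, damping ratio and drive are assumed values chosen purely for illustration; the point is that the membrane responds far more strongly to sound near its resonant frequency, which is exactly the kind of difference a haptic controller could let students feel.

```python
import math

def eardrum_response(drive_freq_hz, natural_freq_hz=3000.0, damping=0.05,
                     duration_s=0.05, dt=1e-6):
    """Simulate a toy ear drum as a damped, driven harmonic oscillator
    and return the peak displacement reached once transients have died away."""
    w0 = 2 * math.pi * natural_freq_hz   # natural angular frequency of the membrane
    w = 2 * math.pi * drive_freq_hz      # angular frequency of the incoming sound
    x, v = 0.0, 0.0                      # displacement and velocity
    peak = 0.0
    steps = int(duration_s / dt)
    for i in range(steps):
        t = i * dt
        # restoring force + damping + unit-amplitude sound-pressure drive
        a = -w0 * w0 * x - 2 * damping * w0 * v + math.sin(w * t)
        v += a * dt                      # semi-implicit Euler keeps the oscillator stable
        x += v * dt
        if t > duration_s / 2:           # ignore the initial transient
            peak = max(peak, abs(x))
    return peak

# The membrane moves roughly ten times further when driven at resonance:
at_resonance = eardrum_response(3000.0)
off_resonance = eardrum_response(300.0)
```

Mapping `peak` to the stiffness or vibration of a haptic controller would let a student sweep the drive frequency and literally feel the resonance peak.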
Touchable Universe project
In phase one, we proved the concept by creating demonstration apps that allowed students to feel a hard tooth set into a squidgy gum, even though both existed only as digital objects rendered on a screen. We also created an app that allowed students to ring tubular bells under water, at sea level and on the edge of space to see how their apparent weight and sound output change at different heights.
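The apparent-weight side of the tubular-bells demo rests on two standard pieces of physics: gravity falls off with the square of distance from the Earth’s centre, and a submerged object is buoyed up by the weight of the water it displaces. A short sketch of that arithmetic (the 5 kg bell and its volume are assumed example values, not figures from the project):

```python
G0 = 9.80665          # standard gravity at sea level, m/s^2
R_EARTH = 6.371e6     # mean Earth radius, m
RHO_WATER = 1000.0    # density of water, kg/m^3

def gravity_at_altitude(h_m):
    """Inverse-square falloff of gravity with height above sea level."""
    return G0 * (R_EARTH / (R_EARTH + h_m)) ** 2

def apparent_weight(mass_kg, h_m=0.0, submerged_volume_m3=0.0):
    """Weight in newtons, reduced by buoyancy if the object is under water."""
    g = gravity_at_altitude(h_m)
    buoyancy = RHO_WATER * submerged_volume_m3 * g
    return mass_kg * g - buoyancy

# A hypothetical 5 kg brass bell displacing about 0.0006 m^3 of water:
sea_level = apparent_weight(5.0)                              # ~49.0 N
edge_of_space = apparent_weight(5.0, h_m=100e3)               # ~47.5 N at the Karman line
under_water = apparent_weight(5.0, submerged_volume_m3=6e-4)  # ~43.2 N
```

Even at 100 km the bell loses only about 3% of its weight, while buoyancy under water removes roughly 12% — differences that are subtle on paper but immediately obvious through a haptic controller.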
Teachers were uniformly positive in their response, and we are now building an expanded offer for schools nationally. Highlights will include:
- The dissection and ear drum apps described above, developed with support from OCR’s panel of expert teachers
- A new software development kit enabling teachers and students to create their own haptic apps
- A ‘lite’ version of our haptic 3D modelling software (derived from Cloud 9, which won the 2013 3D Print Show’s Global Award)
- Support for the BBC’s Micro:bit in the form of customisable and 3D printable cases, and advice on how to convert it into a haptic controller or internet of things device
In developing the concept into classroom-friendly, effective and affordable apps, we will be following a classical design process rooted in the experiences and aspirations of participating teachers and students. Schools that join the project will be given support from design and education experts, and will inform the learning of others via communities of practice.
Our intention is to discover whether students who have access to haptic apps develop deeper levels of comprehension and retention, higher levels of enthusiasm for the subject and higher skill levels than their peers – as tested in the arenas of coding, dissection and 3D making. In turn, we hope that this will drive greater interest in STEM, Art and Design within schools, while giving teachers an opportunity to refresh and develop their approach to learning.
The project development timeframe is 1 December 2015 to 31 March 2016. If you would like to become involved, please feel free to get in touch by sending an email to firstname.lastname@example.org
Alternatively, you can find us at BETT where we will be speaking each day as part of OCR’s ‘Learn Live’ stand.