Opportunities and challenges for smart glasses in teaching, learning and assessment

by Cat Bailey, Kathryn Woodhead, Helen Nicholson-Benn, John Kelly, Jisc, UK.

Variations of smart glasses have been around for years, but newer models are lighter, cheaper and easier to use. Because of this, it’s worth taking another look at what they might offer learners in further and higher education. In our previous post, we explored several current models and what they can do.

In this post, we focus on how smart glasses fit into teaching, learning and assessment. They can be helpful in certain situations, but they also have clear limitations that prevent them from becoming everyday tools right now. Our aim is not to suggest smart glasses belong in every lesson. Instead, we explore where they might add value (and where they fall short!).

TL;DR

  • Smart glasses have existed for years, but newer models are lighter, cheaper and more accessible, so it’s worth taking another look at what they offer further and higher education.
  • They can support remote demonstrations, hands‑free tasks and just‑in‑time guidance, especially in vocational subjects.
  • They may offer benefits for reflection, such as point‑of‑view recordings, and can help learners capture evidence or review their performance.
  • However, technical limits, privacy and practical challenges mean they are not ready for everyday use in teaching and learning.
  • Much of the existing research focuses on older models, so we still lack clear evidence about the real impact of the latest devices in teaching and assessment.
  • Smart glasses may add value in specific scenarios, but wider adoption needs careful thought.

Remote access and practitioner-led demonstrations

Smart glasses can give learners a live, first-person view of an activity they cannot attend in person. This can be helpful for complex tasks such as clinical consultations, lab work or field activities, where learners can watch remotely. Practitioners can show exactly what they see, which could give learners a clearer sense of scale, detail and context than a fixed camera.

Kent and Medway Medical School, for example, is using smart glasses to connect GP surgeries with medical students so they can observe real consultations when placement capacity is limited. Research from Pour et al. (2025) highlights how surgeons are even using smart glasses to give students more detailed visibility during real-time complex surgical procedures.

However, there are challenges. A live video feed can be unreliable due to poor network connectivity, and smart glasses have recording limitations, as discussed in our previous post. The view can be shaky or narrow, making it harder for learners to follow the action, or even causing motion sickness for those watching. Practitioners wearing the glasses may also feel pressure while being observed, which could affect their performance. There are also safeguarding and consent issues, especially in healthcare or public settings, where patients or bystanders must agree to any recording or live streaming.

Hands-free learning

Some smart glasses, such as Vuzix and Epson Moverio, can place digital text and graphics over the real world. Through augmented reality (AR), practitioners can share prompts, safety information, step-by-step instructions and even questions and feedback in the learner’s line of sight. This type of support works well in simulations, escape rooms, lab sessions and field trips, where learners need both hands free and can’t always keep checking a device.

This AR functionality could play a helpful role in specific assessment scenarios. Mirza et al. (2025) highlight that AR tools enable learners to interact with 3D objects in real environments, support in situ guidance and create authentic, practice-based experiences. These features make them well suited to skills-based learning and vocational subjects. In healthcare, for example, learners valued being able to view patient notes without looking down at a phone or tablet. They could maintain eye contact with the patient and their families, with some learners noting how they felt an increased confidence in their communication skills and ability to talk to patients in difficult situations.

There are similar benefits in engineering and motor vehicle repair. Rudra (2025) suggests smart glasses can be used to overlay assembly instructions directly onto equipment. This helps students follow tasks at their own pace and reduces the need to stop and check manuals or videos. Introducing learners to smart glasses now may also prepare them for workplaces where these tools may be used in the future. This was highlighted by attendees at our recent assistive technology meet up who shared how important it is for healthcare learners to understand emerging technologies like smart glasses, as their future patients may use them. Another example is Sanofi, who use devices like the Microsoft HoloLens 2 to support production training. Microsoft claim the glasses have helped Sanofi to shorten training time, increase productivity (as staff are freed up from training others) and spot where staff need extra support.

However, there are downsides. AR overlays can distract learners if they are poorly designed or too busy. Staff may need extra time and training to create clear and accessible prompts. The hardware itself can be expensive, and institutions must consider the long‑term value, especially as Microsoft have discontinued the HoloLens 2, with updates ending in 2027. This raises questions about the future of similar augmented and mixed‑reality devices in industry and education.

Another practical barrier is that several models, including the Meta Ray-Ban glasses, require a paired smartphone for core functions. This means institutions either need to purchase a phone for each device or rely on learners to bring their own. Both options create complications. Requiring staff or students to use their personal phones could be a problem for those who don’t have a compatible device. It may also conflict with existing policies that prohibit phone use in the classroom. Buying dedicated phones adds to the total cost, which could be overlooked when budgeting for smart glasses.

Just-in-time learning

Smart glasses can respond to information, sounds and the location of a learner, giving them support at the moment they need it. Wu et al. (2024) showed how they can deliver location-based, contextual information for a cultural tour. This could benefit exhibitions, showcases, fieldwork or campus tours, with immediate, on-the-spot guidance.

The new Meta Ray-Ban Display glasses, for example, show live captions inside the lens. This could support hard-of-hearing learners or help international students follow spoken content through quick subtitles or translation. The XRai glasses have been designed specifically for this purpose, offering captioning to deaf and hard-of-hearing users. However, we still do not know how accurate or practical these captions would be in large lectures or busy seminar rooms, where noise and fast conversation could make them harder to use.

Smart glasses can also recognise people or objects to give contextual cues. The HTC VIVE Eagle, for example, can recognise faces and prompt the user about their identity. Looking ahead, these capabilities could expand further to identify objects or people within an environment and provide on-the-spot reminders or prompts. This on-the-spot prompting could be particularly helpful for neurodiverse learners, especially those who experience executive functioning challenges (such as difficulties with working memory, planning and organisation, and breaking down tasks). As the technology advances, there is some hope that contextual prompts could help to offload cognitive demand, reinforce routines, and even support emotion recognition for neurodiverse learners (Rudra, 2025).

However, privacy concerns have been raised about the use of facial and object recognition features. Concerns centre on the fact that smart glasses can record people without them noticing and upload those images for AI processing, sometimes without the bystander’s consent. Some manufacturers have taken these concerns into account: Envision’s assistive technology glasses, for example, already learn and prompt the user about faces, but the user must obtain consent from the person beforehand.

Reflective learning

Many smart glasses can record point-of-view (POV) footage. This can help learners on practical courses capture evidence for portfolios or engage in self- and peer reflection. The footage is often lower quality than head-mounted or handheld cameras, and storage may be limited. Even so, smart glasses can still be useful in some learning contexts because the POV perspective itself offers unique insights.

One use case is self‑reflection. Some teachers in training have used smart glasses to “see themselves through their learners’ eyes”. This POV perspective helps them to become more aware of their body language, tone of voice, and eye contact (Reed et al., 2023).

Similarly, learners can also record experiences in the moment, which supports memory recall and helps them reflect on what they were thinking or feeling at the time (Kim, Seo and Shin, 2024). Teachers can use these recordings to give targeted feedback by pausing at key moments and encouraging discussion. They can ask questions such as, “Why did you hesitate at this moment?” or “What else should you have done here?”. Seeing an activity from this viewpoint can reveal behaviours or decisions that are easy to miss in real time.

However, POV recordings can have downsides. Teachers and learners have noted that the restricted field of view can obscure essential details or remove important context (Kim, Seo & Shin, 2024). This can make it harder to offer meaningful feedback. Learners may still struggle to identify areas for improvement from the footage alone; they often require a clear debrief from a teacher or structured peer feedback to support reflection. On top of this, teachers must feel confident using the technology to run these debriefs. Both demands add to staff workload and may discourage staff from using smart glasses in teaching.

Being recorded can also make people feel self-conscious, which may affect how they act. This can influence both the wearer and those around them. However, this issue is not unique to smart glasses. Any new recording tool can cause the same reaction. This is known as the Hawthorne effect, where people change their behaviour when they know they are being observed. The study by Reed et al. mentioned previously suggests that students find smart glasses offer a better learning experience than wall-mounted cameras, so it will be interesting to see whether further studies examine the difference between a head-mounted camera and smart glasses.

Data governance and privacy concerns

Smart glasses raise important questions about privacy and data protection. A quick Google search for ‘wearable technology policies in education’ shows that some colleges have added smart glasses to their policies and already prohibit students from recording in classrooms without express permission.

Institutions and teachers must take care to make sure students are not photographed or recorded without consent. This is straightforward in a controlled classroom, where expectations can be set in advance. It becomes far more complex in public or semi-public spaces, such as a workplace, a fieldwork site or simply an outdoor area, where there is a real risk of recording people who have not agreed to be filmed. This places extra pressure on teachers, who must ensure students are not recording people who have not given express permission (Grace & Haddock, 2023).

As mentioned in the first blog post of this series, many smart glasses are equipped with flashing LEDs to indicate when photos or recordings are taken. However, as that post noted, there are workarounds to turn off the light, and people in the background may not even notice a flashing LED if they are not expecting it.

Safeguarding learners from online bullying is also an important consideration. Recent stories from the BBC have highlighted how members of the public have been filmed without their consent and then ‘trolled’ online.

Assessment

As we were writing this blog post, the US College Board added smart glasses to its prohibited devices list for the SAT. However, it notes that this doesn’t apply to students with accommodations. Presumably, this means that smart glasses approved for use as assistive technology are permitted. This shows that assessment bodies are not rejecting the technology entirely, but are trying to balance fairness, security and accessibility.

In the UK, there is currently no national ban on smart glasses for exams or coursework. Awarding organisations, universities and colleges set their own rules. Some institutions have begun to publish policies on assessment and wearable technology, mainly to avoid unapproved recording or communication during assessments. This is similar to existing rules and policies for smart watches and phones.

Conclusion

Smart glasses have existed in many forms for years, and each model has offered different functions and features for users. More recently, newer designs have become lighter, more affordable and easier to wear. These improvements, along with added accessibility options, make them worth another look. But we also need to ask whether we are truly at the stage where they should be adopted more widely across FE and HE for teaching and learning. Some features still feel underdeveloped, and at times, our reactions to these features have been simply, “So what?”.

It’s worth noting that due to the publishing cycles of academic research, much of the research we reviewed draws on older models of smart glasses. There is far less peer‑reviewed research on the newest models released in the past year. This gap makes it harder to judge how well recent improvements translate into real learning benefits. As a sector, we need more up‑to‑date evidence before making firm decisions.

Therefore, we would love to hear how colleges and universities are using smart glasses already, or how you hope to use them in future.


Editor’s note: this blog post was first published on the Jisc blog.