Harbottle & Lewis' Daniel Tozer and Natalie Smith discuss how VR devs can guard themselves against product liability and data protection problems

Could VR devs be sued by injured players?

This year promises to be the year that virtual reality finally takes centre stage. Absolutely, definitely, probably, maybe. Headsets are starting to hit consumer shelves, signalling the call to action for developers to create that breakthrough app that brings VR to the mass market.

The success of VR is dependent on pushing boundaries, but journeys into uncharted territory often bring with them new legal obstacles and considerations. In this article, we explore some of the potential product liability and data protection issues that developers could face.

VR has been on the cusp of widespread adoption for many years now. Whilst hefty hardware price tags may be an issue for some, other factors challenging widespread adoption are the possible side effects and health and safety risks associated with its use. Motion sickness, nausea, blackouts, behavioural changes and eye strain are the most commonly cited. But real-world falls, trips and bumps are not to be ignored.

But who is responsible? Although on the face of it the question of liability is primarily one for headset manufacturers and VR platform providers, developers also need to be aware of their possible exposure. The most challenging question in relation to liability in the VR context is what or who caused the injury or accident: the hardware or the application? Arguments on this issue are inevitable. Once the cause is established, risk apportionment will be determined by the terms and conditions between the platform provider and the application developer.

Oculus US’ consumer-facing terms attempt to disclaim all liability for ‘third-party content’ made available through the Oculus platform and application; this would include third-party applications. In fact, they generally attempt to disclaim all liability for injury by pointing consumers to health and safety notices available on its website and asking consumers to accept that use is ‘at their sole risk’.

English law renders any attempted exclusion or restriction of liability for death or personal injury caused by negligence wholly ineffective. If the developer designs a virtual world negligently or without signposting risks and warnings, they may be unable to escape liability for injury caused as a result. Possible mitigations include building in appropriate user warnings to remind users that virtual objects don’t hold real weight (a common cause of user accidents), or building in controls to ensure that haptic feedback is not used inappropriately.
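As a rough illustration of the warning-signposting point, a developer could gate session start behind an explicit acknowledgement of the health and safety notices. This is a minimal hypothetical sketch: the names (`SafetyGate`, `WARNINGS`) and the warning text are illustrative, not any real SDK’s API, and real implementations would present these prompts inside the headset UI.

```python
# Hypothetical sketch: refuse to start a VR session until the user has
# actively confirmed the safety warnings. All names are illustrative.

WARNINGS = [
    "Virtual objects have no real weight: do not lean on or try to catch them.",
    "Stay inside your cleared play area to avoid real-world trips and bumps.",
    "Take a break if you experience nausea, eye strain or disorientation.",
]

class SafetyGate:
    def __init__(self, warnings):
        self.warnings = list(warnings)
        self.acknowledged = False

    def present(self):
        """Return the warnings that must be displayed before play begins."""
        return self.warnings

    def acknowledge(self):
        # Called only on a positive user action (e.g. pressing "I understand").
        self.acknowledged = True

    def can_start_session(self):
        return self.acknowledged

gate = SafetyGate(WARNINGS)
assert not gate.can_start_session()  # blocked until warnings are confirmed
gate.acknowledge()
assert gate.can_start_session()
```

The design choice here is that the gate defaults to closed: the session cannot begin by omission, only by a deliberate user action, which mirrors the signposting obligation discussed above.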

VR stakeholders in the UK won’t be able to disclaim liability by pointing users to notices that use of the headsets and applications are ‘at their sole risk’. Health and safety must be at the forefront of a developer’s design brief, even before the question of liability apportionment, as the success of any application will be affected by consumer trust in the safety of the product.


Virtual reality engagement will in most cases involve the transmission of personal data. The ‘virtual avatar’ poses an interesting legal question of whether data collected through the avatar’s interaction with the virtual world can amount to personal data for the purposes of European laws… but ultimately where a user registers his or her details to an account in order to enter a virtual world, personal data is collected.

Any subsequent data then collected about that individual’s interaction with their virtual world (such as their location, user engagement, transactions and preferences), when matched with their account, will in most cases also constitute personal data. VR provides the opportunity for developers, platform providers and hardware manufacturers to gather significant amounts of very rich data about their users. Whilst collection and use of this information is invaluable for the purposes of gaining insights about user habits, preferences and product improvements, it is governed by increasingly stringent data protection laws.

Readers will likely have heard about the looming General Data Protection Regulation (the GDPR) set to shake up and harmonise data protection laws across Europe. Although not due to take effect until May 2018, the proposed changes are significant and will likely impact all those interacting with or ‘processing’ personal data. Coupled with the increased obligations are the significantly increased fines for non-compliance. The UK’s data protection regulator (the ICO) currently only has the authority to issue fines of up to £500,000 for serious breaches of data protection laws, whereas maximum fines under the GDPR will increase to the greater of €20m or four per cent of worldwide turnover. Yikes!

But what changes are afoot? Too many to mention here, but there are a couple to highlight in a VR context.

As per existing laws, under the GDPR businesses must have a legitimate reason to process personal data, but the GDPR significantly strengthens the requirements for those businesses seeking to rely on an individual’s consent as their legitimate reason for processing. GDPR consent must be freely given, specific, informed and must be an unambiguous indication of wishes by “clear affirmative action”.

In a VR context, presenting the user with the necessary information and obtaining this requisite consent will be challenging. VR stakeholders will no longer be able to bury consent language in lengthy terms or privacy policies but must, as with safety warnings, build consents and privacy notices into the user flow.
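To make the consent requirements concrete, the sketch below records a separate, explicit opt-in per processing purpose, with nothing pre-ticked and withdrawal as easy as granting. It is a hypothetical illustration only (`ConsentManager` and the purpose names are invented for this example), not a compliance tool.

```python
# Illustrative per-purpose consent capture; all names are hypothetical.
from datetime import datetime, timezone

class ConsentManager:
    """Records an explicit opt-in per processing purpose; nothing is pre-ticked."""

    def __init__(self):
        self._consents = {}  # purpose -> timestamp of the affirmative action

    def grant(self, purpose):
        # Called only on a clear affirmative action, e.g. the user selecting
        # "Allow" on an in-world prompt describing this specific purpose.
        self._consents[purpose] = datetime.now(timezone.utc)

    def withdraw(self, purpose):
        # Withdrawing consent should be as easy as giving it.
        self._consents.pop(purpose, None)

    def has_consent(self, purpose):
        return purpose in self._consents

cm = ConsentManager()
assert not cm.has_consent("location_analytics")  # no consent by default
cm.grant("location_analytics")
assert cm.has_consent("location_analytics")
cm.withdraw("location_analytics")
assert not cm.has_consent("location_analytics")
```

Storing a timestamp against each grant also gives the business evidence of when and to what the user consented, which helps demonstrate that consent was specific and informed.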


Businesses must appoint a Data Protection Officer (DPO) responsible for assisting with all data protection matters if their core business activities involve regular and systematic monitoring of individuals on a large scale. We await more guidance from EU regulators on the application of these criteria but expect it will apply to many VR platform and application providers collecting user data. Among other things, DPOs will be responsible for overseeing data protection ‘impact assessments’ where the processing of personal data involves new technology (such as new VR applications) and is likely to pose a high risk to individuals’ rights and freedoms.

VR businesses will also be required to maintain detailed records of their personal data collection and processing including the categories of information collected, the purposes for which it is collected, third parties to whom the data are disclosed, and so on. This will be an onerous task for any business but especially so for those collecting large amounts of personal data such as VR platforms and application providers. 
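The record-keeping obligation described above boils down to answering, for each processing activity: what is collected, why, who receives it, and how long it is kept. A minimal sketch of such a record might look like the following; the activity names, data categories and retention periods are invented examples, not a template endorsed by any regulator.

```python
# Hypothetical sketch of a record of processing activities for a VR app.
# Every field value here is an invented example for illustration.
processing_records = [
    {
        "activity": "account_registration",
        "categories_of_data": ["name", "email address"],
        "purposes": ["account creation", "authentication"],
        "recipients": ["cloud hosting provider"],
        "retention": "duration of account plus 12 months",
    },
    {
        "activity": "in_world_analytics",
        "categories_of_data": ["location in virtual world", "session length"],
        "purposes": ["product improvement"],
        "recipients": ["analytics vendor"],
        "retention": "24 months",
    },
]

# Sanity check: each record must state what is collected and why.
for record in processing_records:
    assert record["categories_of_data"] and record["purposes"]
```

Keeping these records as structured data rather than prose makes it far easier to keep them current as new features (and new data flows) ship.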

Finally, VR stakeholders engaging with personal data will be required under the GDPR to adopt a ‘data protection by design’ and ‘by default’ approach to personal data and product development. Fundamentally, the GDPR necessitates a real shift in the approach to personal data for all. As the party who decides what data are collected, how, and for what purposes, the developer is integral to this shift.

VR presents some real-world opportunities and issues. Don’t get caught out. 

Daniel Tozer is a partner and Natalie Smith an associate at London law firm Harbottle & Lewis. This article is part of our month-long Virtual Reality Special.
