Omnicept AI: Data Privacy and Security

As an experimental psychologist, I spent more than a decade learning the art and science of collecting (and protecting) data from human subjects. In fact, one of the things that distinguishes experimental psychology from other experimental sciences like chemistry or biology is the extra level of care and training that is required to ethically collect and responsibly maintain personal, private, and, frankly, invasive information about the very real people who entrust us with their data. It is an awesome responsibility, in every sense of the term, and it is one that I take incredibly seriously.  


When I made the transition from collecting data at a university to collecting data in industry, I was nervous. I have colleagues who work for the biggest companies in research and technology, and when I asked how they handled user data, their pregnant pauses did not inspire confidence. Luckily, I can say without reservation that my experience at HP has been demonstrably different. HP is serious about protecting its users’ data, and the Omnicept team’s data policies are on par with university standards.


If you have spent time reviewing the Omnicept launch materials, you may already know about our data collection effort. To develop the AI that supports Omnicept, we collected data from more than 1,000 individuals across four continents over two and a half years. One point of pride for us is that we established a data privacy and security framework that carries over to the Omnicept platform. We believe individuals have the right to control their own data and we want end users to know what data is being collected about them and who has access to that data. Below are some of the specifics of the Omnicept data privacy and security framework.


Privacy Impact Assessment. When we collect user data of any kind at HP, we start with a formal assessment of the impact of the data collection on user privacy. This assessment is conducted by HP’s Global Privacy Office, which itself collaborates with external thought leaders and advisors to vet our privacy programs and policies. Consistent with the high standards of university institutional review boards, our Global Privacy Office carefully reviews our protocols to keep the human subjects we work with protected and their data secure.


Informed Consent. Prior to collecting data, we conduct a formal consent process to make sure users understand and agree to share their data with us. This includes plain language about why we’re collecting data in the first place, what data is being collected, who has access to the data, and information on how long we keep data and where it’s stored. For data collected internally at HP, we make sure our users know that they can withdraw part (or all) of their consent, at any time, and that they can ask for their data to be retracted, deleted, and removed from our servers.


Deidentification. Deidentifying data means removing information that can tie the data to an individual person (things like email or IP addresses). We deidentify our data locally, separating consent forms from data files and assigning users randomly generated subject numbers. Subject numbers and identifiers are tied together in an encrypted master list that is stored locally and can be accessed by only one or two individuals who are specially trained to handle sensitive data (like me!). We regularly update the master list so that we can be responsive to any user who wants their data deleted.
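To make the procedure concrete, here is a minimal sketch of the deidentification step described above: direct identifiers are replaced with randomly generated subject numbers, and the mapping between the two is kept in a separate master list. The field names and record shape are hypothetical, and in practice the master list would be encrypted and accessible only to trained staff.

```python
import secrets

def deidentify(records):
    """Replace direct identifiers with randomly generated subject numbers.

    records: list of dicts; "email" and "ip" stand in for direct
    identifiers (hypothetical field names for illustration).
    Returns the de-identified records plus a master list mapping subject
    numbers back to identifiers, which must be stored separately (and,
    in a real deployment, encrypted).
    """
    master_list = {}   # subject number -> original identifier
    cleaned = []
    for record in records:
        subject_number = "S-" + secrets.token_hex(4)  # e.g. "S-a3f19c02"
        master_list[subject_number] = record["email"]
        cleaned.append({
            "subject": subject_number,
            # copy everything except the direct identifiers
            **{k: v for k, v in record.items() if k not in ("email", "ip")},
        })
    return cleaned, master_list

records = [{"email": "user@example.com", "ip": "10.0.0.1", "heart_rate": 72}]
data, master = deidentify(records)
```

Keeping the master list as a separate artifact is what makes "responsive deletion" possible: when a user withdraws consent, the subject number is looked up and the corresponding data files are purged.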


GDPR. Before we start patting ourselves on the back too vigorously, I want to point out that much of the Omnicept privacy and security framework is drawn from the European Union’s General Data Protection Regulation (GDPR), to which all companies doing business in the EU must adhere. These provisions include informed consent, deidentification and encryption of data, and the right to have user data corrected, retracted, or deleted. What’s different about HP’s stance on data privacy is the willingness to put our policies right out in the open, to be honest about what we are doing and why we are doing it, and to engage in meaningful dialogue with users and partners. HP is committed to protecting user data and we expect our external partners to follow our lead. HP’s external partners sign and agree to HP’s privacy terms and conditions, and we will continue to work with them to refine their (and our) data protection best practices. We want to highlight the user protections in GDPR, not circumvent them in complicated end-user license agreements.


Secure Data Storage. Our focus on user privacy and security is not limited to the data we collect internally. The headset also has privacy controls incorporated into the application that enable users to turn off sensors in the headset. No data is stored on the headset, and HP does not know who an end user is. Any deidentified data that HP receives from our partners will be used for product improvement (like improving the accuracy of our machine learning models) and will conform to our strict consent and deidentification procedures. Data is processed locally on-site and deidentified in the cloud. Deidentified data is stored in a secure cloud and erased yearly.


Biometric Data Aggregation. The Omnicept platform offers incredible opportunities to better understand the bodies and minds of users in VR, and we can’t wait to see what amazing things our partners will do with those insights. But there is real danger in misunderstanding or mishandling biometric data, particularly cardiovascular data, and we take that concern very seriously. We have added an additional layer of protection and security for end users, over and above traditional data security measures. Instead of reporting raw data from the PPG sensor, the Omnicept SDK aggregates PPG data, reporting only heart rate and pulse rate variability, but not the beat-by-beat cardiac trace. We want to protect end users from sharing sensitive information about their health whenever possible, and this is one important way to do that.
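The Omnicept SDK’s internals aren’t shown here, but the aggregation idea can be sketched in a few lines. Assuming beat timestamps have already been detected from the PPG signal, the sketch below reduces them to an average heart rate and RMSSD (one common variability measure); only the two summary numbers leave the function, never the per-beat trace. The function name and the choice of RMSSD are illustrative assumptions, not the SDK’s actual API.

```python
from statistics import mean

def aggregate_ppg(beat_times_s):
    """Reduce a beat-by-beat cardiac trace to two summary measures.

    beat_times_s: timestamps (in seconds) of detected PPG pulse peaks.
    Returns (heart_rate_bpm, rmssd_ms): average heart rate and RMSSD,
    a common pulse rate variability measure. The per-beat intervals are
    discarded; only the aggregates are ever reported.
    """
    # Inter-beat intervals between successive pulse peaks
    ibis = [b - a for a, b in zip(beat_times_s, beat_times_s[1:])]
    heart_rate_bpm = 60.0 / mean(ibis)
    # RMSSD: root mean square of successive interval differences, in ms
    diffs = [b - a for a, b in zip(ibis, ibis[1:])]
    rmssd_ms = mean(d * d for d in diffs) ** 0.5 * 1000.0
    return heart_rate_bpm, rmssd_ms

# Ten beats exactly one second apart: 60 bpm with zero variability
hr, prv = aggregate_ppg([float(t) for t in range(10)])
```

The privacy benefit is that the beat-by-beat trace, from which clinically sensitive features could be inferred, never crosses the API boundary.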


For us, data privacy is not a license agreement or a policy sheet. Privacy standards are evolving, and we are committed to evolving with them. Over the coming months, we will continue to add privacy features based on feedback from partners, users, and thought leaders in AI and VR.


We want to get privacy right and we want to help you get it right too. Keep an eye on the developer portal where we will continue to post explainers, FAQs, guidance, templates, and more.



Author: erika.siegel