Is It The Technology, or Us?
280 characters doesn't cut it when a twitterX post hypes privacy concerns
Responding to concerns expressed on twitterX about in-vehicle videos inappropriately viewed and shared by company staff, I was struck by the force of the attack on the technology (the camera) and by the absence of any comment on the behaviour of those laughing at scenes, intimate and otherwise, caught on video.
I suggested that the issue is as much one of workplace and societal culture as it is one of privacy.
Oh my. Woe betide the twitterX adventurer.
“How do you reconcile being a ‘privacy advocate’ and then put the onus on the consumer?!” This tweet, at least, was worth a response. But not in 280 characters.
That, say I, is a very, very big topic—bigger than this piece. And yes, I do see myself as a privacy advocate through knowledge system design and contributions to the field of recorded information management.
Many illusions shape our view on what is acceptable when it comes to personal privacy. Little knowledge shapes the average grasp on how information systems work. That has been true of paper-based systems as much as electronic ones—but that is for another day.
In relation to the case in point, footage from an in-cabin camera, switched on via a user-controlled setting, is conveyed to staff monitoring car performance (itself enabled by another user-controlled setting). Both the interior and exterior cameras, likewise governed by user settings, are security features.
The car is watching. Because you told it to.
It can be a relief to be able to check on your car's surroundings when you are ready to leave on a night out. It is even more reassuring to know that if the driver is nodding off, the car will deliver auditory and visual stimuli to wake that driver up.
Evidently the culture in at least one workplace (and let’s not be naive about how many) enables employees who get a kick out of watching customer videos—and sharing them. Does the risk arise from the camera, or from the employees and the corporation, or us?
All of the above, of course. Let’s ponder trade-offs more broadly.
Some argue that informed consent covers most issues. But what is informed consent? If we mean the kind of consent granted before an operation whose implications we don't understand, immediately or over time, then it should be noted that such consent is mostly about limiting corporate liability. It is not about ensuring that your consent is grounded in knowledge sufficient to weigh alternatives. Often the alternative is grim. You can choose treatment, while signing away rights of redress, or choose to remain untreated. Congratulations: your consent, your fault, whatever happens.
We can similarly take cold comfort from opting out of the security features of a vehicle if we prefer not to “consent” to the employees sharing our images for laughs. These are no-win quandaries.
Maybe privacy is something more than whether or not your car has a camera—or your house has windows.
An organizational culture that embodies respect for privacy as a human right, along with system design that de-identifies persons as soon as possible, is better privacy protection than a signed waiver. And it takes ongoing attention.
Defining the balance between privacy and security should not be left to those whose interest lies in exploiting our privacy. Arguably, that’s where it sits now in the realm of social media.
In-built logic to manage metadata can safeguard our privacy with accountability. I do not argue for governmental control here. Rather, I support a standards regime that, combined with the market, empowers consumers to drive practice toward responsible use of data. Government has a role in articulating the desired outcome. The market has a role in innovating means to achieve that outcome. You and I have a role in understanding the choices we have and the implications of decisions we make.
It really is up to all of us to learn more about how the systems we live within work. If we do not, we become compliant objects within system designs without ever realising it. At a basic level, it's up to us to understand and properly manage our device settings: the first line of defence.
Kids need to learn how information systems work and how their design influences the “knowledge” they think they have gained. That is vital, a foundation for life now incorporated into the Finnish school system.[1]
It’s not too late for us elders, either. We all need to be aware of what ethics and integrity are, and how to discern the patterns that reveal when decision makers veer from the path. Good practices can be rewarded; bad practices exposed and shut down.
It is a cultural shift, long overdue.
[1] Digital Literacy in Finnish Education: A Model for the World, Finland Education Hub, 2023.