Patatoa

We are the Adults

A coworker and I were discussing options for an authentication scheme to use. We needed to make a decision. My knee-jerk response was to ask our boss what he wanted to do. That’s when everything came into focus. Our boss was the COO; his focus was not at the level of one decision for one app. Even then, he wouldn’t know the details of the options as well as we did. We were the adults. We were the best judges here. In my early thirties, I realized that when I log into my bank app, that security wasn’t implemented by some grizzled expert. It was put in place by some thirty-something or younger person who was also following his boss’ incomplete vision.

This realization reflects the reckoning going on with "social media" today. Facebook & Twitter weren’t designed to become the monoliths they are now. When they began, we did not carry the internet with us everywhere, and Facebook & Twitter were just a couple of the websites we checked. They were made to share thoughts and pictures. They were not one-stop shops and only became so haphazardly. One day, someone’s boss thought it wasn’t enough to be popular. Why should anyone ever click away to another site? So they made it easier to share news articles, sell things, make niche groups. Then they asked why their site’s engagement should be at the whim of each person’s friends list. So they pressed their dev team to whip up a way to dynamically rank and order posts to elicit more activity. And as these things were mused and implemented, there was no breathing room to stop and think, "Is this what people want?" What are the repercussions of these experiments? Should we present all articles as "news"? Is it healthy to keep feeding people content they did not necessarily subscribe to? Should one site centralize so much of a person’s web activity? There was a two-week sprint to put it in place, and no time to raise questions or discover their answers.

Netflix released the documentary The Social Dilemma, and for a week, people considered these issues. Unfortunately, many of the conclusions boiled down to "turn off notifications," which falsely pins the problem on us. This in spite of the many executives and developers it included lamenting their creations. Frankly, it was exactly the "we’re sorry" BP mantra from South Park. Any of this could have been avoided had they done a more contained trial run, run a more honest and ethical retrospective, and been more inclusive with the voices allowed to give feedback. "Move fast and break stuff" is supposed to apply only to the project. General social fabric and security shouldn’t be among the breakable.

There are standards for things people use every day. There are standards on the food and medicine we take, on the buildings we work and live in, on the cars we ride in, on the electronics we use, and even most mass media we consume. The reason for that huge qualifier is that, for some reason, there is an exception for the software we use. Presumably this is because consumer software wasn’t a thing until after deregulation. Other than some industry-specific certifications, and laws against literal computer viruses, there are no guidelines for user security, privacy, or usability. The developers often aren’t cognizant of these things because they’re too focused on the product and have never had to worry about any of them. The product owners only care enough to hide behind a EULA and wash their hands. That is not enough.

We put the onus on the consumer to vet news sources, to reconsider what they share with their friends and family, what apps they install or sites they visit. Consequences are their fault. There’s a cottage industry of companies whose operating principle is to exploit these flawed platforms and their visitors. Extremist news rags, video content farms, identity thieves; sometimes the platforms weed them out if they start targeting too broadly, but sometimes they just happen to be strange bedfellows. Bad actors can be expected on any big platform, but often it’s the poorly conceived mechanisms implemented in haste that allow them to be so effective.

I don’t think it would hurt to create a regulatory agency to vet technology companies’ relationships with their users. Establish what data points are appropriate to collect for different types of users, and create minimum security requirements for the data points that are captured. Regulate how that digital data can be linked to a real-life person. Evaluate what kinds of advertisements can be shown, per situation, and how targeted they can be. Clearly differentiate between vetted news sources and opinion or speculation articles. People in the tech sector may bristle at such a suggestion, but I think these are considerations regular people can get behind. Nothing I said above limits ingenuity, and I think these pose good problems to solve creatively. While the tech sector has started to self-correct, it is only a start. And far, far too late for many.

In the tech field, as in popular culture, the idea of government regulation is unpopular. Apart from the occasional anti-trust suit and the occasional slap on the wrist when private data is stolen, tech is left to itself, occasionally techno-babbling its way out of consequences. It also blossoms at the expense of the people it invites. How often do we start seeing ads for things we mentioned to our friends, or have second thoughts about using a social platform to share pictures of our kids? These platforms are so centralizing that there’s little we can do but not participate whatsoever. I feel like that also is not the answer. We want to share and connect. That’s the real centralizing force. And because it’s a force so strong, those of us who have any amount of influence need to protect it. I am bringing this mindset to everything I make from now on, to the best of my ability. I urge all developers to do the same. Consider, before it becomes an issue.