privacy scale

This commit is contained in: parent 5883275da0, commit 2817fd5022

2 changed files with 148 additions and 1 deletion
@@ -2,7 +2,7 @@
path: "lorem-ipsum"
date: 2020-01-01
title: "Lorem Ipsum"

hidden: true
---

Hey! You've found the secret page that I use to test out style changes.
147 src/notes/2021-02-13-the-privacy-scale.mdx Normal file

@@ -0,0 +1,147 @@
---
path: "the-privacy-scale"
date: 2021-02-13
title: "The Privacy Scale"
---

It's not difficult to imagine that we've already hit some proto-cyberpunk dystopia milestone. Large corporations no longer strictly need your user information these days. Specifically, they can glean as much from your actions as they can from what people generally call your personally identifiable data. This is how they skirt around privacy concerns: their definition of personally identifiable information isn't the same as yours.

I suppose that I should preface this with, _"These are my personal opinions, and anything discussed in this article (like everything on this website) does not reflect any opinion that any employer, past or future, holds."_

## The cost of storing data

There are a lot of problems with storing traditionally personal user information. Not only does one need to invest in ensuring this data stays secure, but there are regulations whose audits and violations carry risks that companies may not want to accept. For larger companies, this is merely one facet of the risk profile they consider, and the penalties are inconsequential. For smaller companies and startups with dreams and ideals, these costs can be and usually are prohibitive. As a result, the primary cost for companies that _can_ support handling user data is the maintenance cost of (hopefully progressive) security for existing user data.

Strong encryption and cybersecurity have effectively changed how the cost of user data is evaluated. Encryption not only acts as a strong, catch-all layer of protection for the data companies need to guard, but also as a way to convince users to trust those companies with their data. It is a panacea: reducing costs while providing immediate benefits to their bottom line. Encryption has allowed large corporations to marginalize the cost of user data with the idea that it convinces more users to join. They will even support and promote security standards in a very public manner. In the worst case, they simply obtain positive sentiment. In the best case, it shields them from negative feedback. And in any case, it further secures their bottom line.

Despite this, the ironic part of all of this is that it _is_ strictly positive for users. There is no valid reason for services not to provide some layer of encryption for their data. Even improperly protected data provides some benefit (albeit one perhaps outweighed by the false advertising), even if decrypting it requires only minimal effort from attackers. I would perhaps call this an instance of [Egoistic Altruism] at best.

For corporations, this is a good solution but a suboptimal one. After all, there still exists a linear cost to storing user data.

## What you do, not what you are

A corporation is not interested in you. They don't care about your achievements, your failures, or anything that makes you human. In the strictest terms, they simply care about what you do. You are not a human; you are a [black box], and if they can predict what that black box will do, they won't need your personally identifiable information at all. It starts with simple patterns and correlations: buy a pet and you now need pet food; like food and you will enjoy food-related products more. They then build superpatterns: patterns based on simpler patterns, where the patterns themselves are inputs used to make further predictions. When these patterns are wrong, they can be readjusted and refined for greater accuracy. This is how machine learning works, at a very, very broad and layman's scale.[^1]

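To make the pattern-upon-pattern idea concrete, here is a minimal, purely hypothetical sketch in Python. The event names and IDs are invented, and this resembles no real company's pipeline; the point is only that a simple conditional-probability estimate needs nothing but an opaque ID and a log of actions.

```python
# Toy behavioral log keyed by an opaque ID. Note there are no names,
# emails, or other identifying fields anywhere -- only actions.
events = [
    ("u1", "bought_pet"), ("u1", "viewed_pet_food"),
    ("u2", "bought_pet"), ("u2", "viewed_pet_food"),
    ("u3", "bought_pet"),
    ("u4", "liked_food_posts"),
]

def follow_up_rate(events, antecedent, consequent):
    """Estimate P(user did `consequent` | user did `antecedent`)
    from the event log alone."""
    by_user = {}
    for user, action in events:
        by_user.setdefault(user, set()).add(action)
    did_antecedent = [acts for acts in by_user.values() if antecedent in acts]
    if not did_antecedent:
        return 0.0
    return sum(consequent in acts for acts in did_antecedent) / len(did_antecedent)

# "Buy a pet and you now need pet food": 2 of the 3 pet buyers went on
# to look at pet food, so the estimated rate is 2/3.
print(round(follow_up_rate(events, "bought_pet", "viewed_pet_food"), 2))
```

The "superpattern" step is then just feeding estimates like this back in as features for further predictions; none of it requires knowing who `u1` actually is.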
Notice how I haven't mentioned any identifiable information at all. Corporations don't care who you are; they care about what you do. They don't care that you got pregnant; they care about what you do now that you are. This is the data they're collecting, and this is how they avoid the fundamental problem of maintaining anonymity in their data. So long as they treat you as a black box rather than as a person, they can avoid storing information that makes you fundamentally you.[^2] This is the next step, and often the fundamental misunderstanding users have when they hear corporations are collecting their data. So long as you're tied to some ID, you can reveal as little as you like and they'll still find ways to utilize what you do or don't do; doing nothing is doing something, after all.

We're at the stage where corporations no longer need to know who you are to influence what you do.

## Decentralization and federation

Recently, federation has been popularized not only because it directly opposes the internet-centralizing corporations, but also as a way to bypass censorship. I personally view this as an attempt to return to the ideals the internet strived for in its early stages, but unfortunately that's beside the point.[^3] For our purposes, federation directly opposes this behavior collection by redirecting your behaviors to a domain outside the corporations' control. This is not without its own set of problems, however.

### Digital Distance

There are, unfortunately, social costs that may make federation prohibitive to potential new users. Not only do the corporations have first-mover advantage, but your membership often depends on whom you frequently contact. It's very common to see users support some new technology only to not use it when the time comes, simply because they wish to remain "digitally close" and thus stay on the platform they were already on. The cognitive cost of adopting a new service _and_ convincing others to follow is perhaps the primary reason for non-adoption.

Suppose, however, that you and your friends are willing to switch. It's reasonable to assume that at some point, one of your peers will suggest going all the way: creating your own instance and joining the federation as a smaller community. After all, controlling your own domain is the natural end of federation, and the natural extension of joining an instance is to create one yourself. So you do, but then a thought occurs: _"Why doesn't everyone host their own instance and federate? Then everyone has control of their own domain, removing the need to trust anyone else with their data."_ This seems like an excellent idea, as every person now controls the domain for their behaviors.

### Single-user instances

These single-user instances are the most paradoxical case for federation privacy. While such an instance does allow you to control a domain, it does so in the way that is most counter-productive to protecting your privacy. **You have, in essence, created a unique ID for yourself across the entire federation.** The nature of federation means others must necessarily act and react to your behaviors, and by being the sole inhabitant of a unique domain, you have provided a very easy way to identify both you and your actions. While yes, you have full control over where you act, you have not solved your original problem at all.

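A hypothetical sketch of why this is so (the domain names, account counts, and traffic here are all invented): every federated activity carries its origin domain, so any peer can group what it sees by domain, and a domain known to host exactly one account collapses into a behavioral profile of exactly one person.

```python
from collections import defaultdict

# Invented federation traffic as seen by any peer instance:
# (origin_domain, action). The origin domain necessarily travels
# with every federated activity.
traffic = [
    ("big.example", "post"), ("big.example", "like"), ("big.example", "post"),
    ("solo.example", "post"), ("solo.example", "like"),
]

# Assumed public knowledge of how many accounts each domain hosts.
accounts = {"big.example": 5000, "solo.example": 1}

def linkable_profiles(traffic, accounts):
    """Group actions by origin domain; a single-account domain is
    a complete behavioral profile of one identifiable person."""
    by_domain = defaultdict(list)
    for domain, action in traffic:
        by_domain[domain].append(action)
    return {d: acts for d, acts in by_domain.items() if accounts.get(d) == 1}

print(linkable_profiles(traffic, accounts))
# {'solo.example': ['post', 'like']}
```

The actions from `big.example` blend into thousands of accounts; the actions from `solo.example` are yours by construction, no personally identifiable information required.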
|
## Balance

Ultimately, this touches on anonymity and privacy in numbers. Some centralization is necessary, and too much or too little is problematic. Users must give up control of their own domain to control their privacy, disengage from the digital world entirely, or risk coupling some identification to their behaviors... so long as identification is necessary.

Perhaps image boards did it best. Humans can be social without needing some way to tie a person to their behaviors, but I don't think society will move towards this. People inevitably want to take pride in who they are,[^4] so the cost of anonymity may be too great for us to handle.

But hope is free, and so I will hope.
[Egoistic Altruism]: https://www.youtube.com/watch?v=rvskMHn0sqQ
[black box]: https://en.wikipedia.org/wiki/Black_box

[^1]: Of course, with broad strokes come broad inaccuracies, but this is sufficient for our needs.
[^2]: With regards to your identifiable information. Behavioralists here will argue that what you do _is_ who you are. This is a fair point, but it's not difficult for corporations to selectively choose behaviors that prevent unique identification of you, making your behaviors anonymous as well.
[^3]: In the most apolitical sense. To quote John Gilmore, _"The Net treats censorship as a defect and routes around it."_
[^4]: You can decide if this is self-reflection or irony.