Tuesday, October 30, 2018

OCTOBER 29, 2018 BLOG POST
The Filter Bubble by Eli Pariser

Pariser's opening chapters seem more personal to him, recounting his experience with social media and reflecting on how the internet has become somewhat "personalized" to his tastes, desires, and interests. I find this prospect absolutely alarming in the same way I find "predictive" texting annoying. It seems to me that computers, or the individuals who build and run them behind the scenes, are also influenced by businessmen, but that's not the point I'm chasing here. I'm thinking that, while a personalized internet experience saves time and truly shows us what we "want" to see, it also limits us to our tastes, desires, and interests. Does this limitation strike anyone else as being somewhat odd? Doesn't being exposed to things we don't care for round us out more effectively than being surrounded by our predilections?



There is a strong possibility that my "GRE brain" is still operating at a high "argumentative" capacity, and I'm tempted to shred any argument that dares stand in my path. But back to Pariser's exploration of the personalized internet experience. Consider what this means for humanity. Reflect for just a second on what predictive functions in computers could mean for us. It means laziness, in short. It means we are given the opportunity to think less because the computer can predict what we would like to say or think or desire next. There's no need to take the extra step and free-think when "predictive" text already knows what word we're looking for among hundreds. The same goes for what Pariser is examining: creating a personalized internet experience limits our exigency to think. Am I employing that word correctly? I'm continuing the project of expanding my vocabulary per the unintentional inspiration of the GRE.

Pariser goes on in Chapter Three to discuss issues with the filter bubble, helping readers understand the phenomenon through an analogy with Adderall. This almost appears to me to be an examination of priority, in some ways. That is, he'd have us consider that the filter bubble, like Adderall, prevents users from growing distracted by homing our attention in on what could be considered, for the sake of simplicity, important. This sounds like an echo of priority, and the question becomes: what do we deem important? Pariser has an answer with an implication. He argues that either extreme has a poor outcome, resulting in some form of one-track thinking that cancels out creativity, which in turn closes down opportunities for the development of fresh, new ideas and insights.

Reflecting back upon internet personalization, I'm thinking in Pariser's terms, that is, about the possibility that we should be more concerned with people than with the internet. It's easy to blame the internet for "personalizing" itself to our needs and stripping us of the ability to think for ourselves, but the truth is that it's only designed that way, with economic intent, by people. I think Pariser delves into this a little in the last few chapters; I skimmed them quickly, so I'm drawing from memory. So the problem of trust lies not with the personalized internet, but with people, again. Predictable, perhaps. But the internet only serves as powerful, intelligent software that quickly jumps between extremes depending on who "possesses" the software, or has influence over it. This may be leading into a dialogue about artificial intelligence, but I'll spare you, reader. Look up Elon Musk and tell me your thoughts. We'll chat about it over coffee, although it may take a while.




3 comments:

  1. Hey Matthew,
    Your question at the beginning is what I'm gonna try to address:

    "Does this limitation strike anyone else as being somewhat odd? Doesn't being exposed to things we don't care for round us out more effectively than being surrounded by our predilections?"

    Yes, this limitation strikes me as odd because it feels like the exact opposite should be occurring. It kind of makes me think about a conversation we had in my capstone about originality after reading a book by Joseph Harris called "Rewriting," and how nothing can ever really be original. That was disturbing to me in the same way that the filter bubble is disturbing.
    Why? Because it means that without input from outside sources (other people, the world at large, etc.) a person just stagnates, stays the same. Life is ever changing, and I think that bad things happen when we sit still for too long. This also ties into confirmation bias, and the ways a filter bubble makes a person think that their preferences are universal and should not be challenged. This happens to many people, and it's very sad. Myself probably included!

  2. I appreciate your final comments, where you say "we should be more concerned with people than with the internet". It is an interesting point to bring up, especially in a world where our first instinct is to find a blanket term for the issues we run into. When we think of the internet as being too personalizing and, if I may, intrusive, it is because we demonstrated a need for it. It's easier to blame a larger 'unknown' entity like the internet, though.
    I will add that I haaaaate Elon Musk. He is literally a supervillain from a '90s kids' movie.

    1. You know what I also hate? A site that tells me I'm logged in and then says I'm unknown.
      ~Kas
