Thank you for sharing the article, Jon.
I agree with your points on Facebook. Though I still use Facebook, it is more from an observational standpoint now. It is interesting to see the development of news and opinions through the lens of Facebook. For example, I find it interesting how certain populations of my friends on Facebook hold viewpoints that strongly clash with those of other segments of my friends (e.g. left- vs. right-wing political viewpoints). There doesn't seem to be any middle ground for the most part; you are either on one side or the other, and moderates are drowned out. It's the formation of social media bubbles.
I find Reddit running into this issue now too, as people seem to congregate only in their favourite subreddits. Subreddits have become factions of a sort, and sometimes they 'brigade' other subreddits. There is more polarization, too.
Interesting times we are in.
Cheers,
Viet
One thing that bothers me a bit about TBL's stance is that such clustering will occur no matter how open and distributed the network - it's built into the very nature of the web that he designed. Subreddits are functionally equivalent to websites or, for that matter, newsgroups of old in that regard. When we get to choose what we see and who we interact with, echo chambers are inevitable. Nothing new here apart from scale and the length of the long tail.
Filter bubbles, on the other hand, are far from inevitable. Where I think he is right to be worried is that the perspective you see on Facebook and, in principle, might see via Google - depending on the degree of 'personalization' (I greatly dislike the use of that term to mean something done to somebody rather than by somebody) you let it get away with - is determined not by individuals but by secret algorithms. This means that, especially in the case of Facebook (which knows and ruthlessly exploits the value of polarization in driving engagement), your view of what your 'friends' are saying is not necessarily determined by the natural spread of opinion among them. It is shaped instead by the system's deliberate manipulation of what you see, which, thanks to FB's largely repulsive algorithms, represents a tiny fraction of the whole.
One part of the answer might lie in making those algorithms scrutable. This opens up a can of worms, inasmuch as it makes it easier for evil-doers to exploit them, but, much as open source can be more secure than closed source because people can spot and fix the bugs (this only works when there is ongoing active development), it would certainly encourage the perpetrators to tread more carefully. Another part lies in making them not just scrutable but personalizable (in the true sense of the word). Unfortunately, as Judy Kay (and I, for that matter, in a slightly different way) found a decade or two ago, the chances of people even understanding the effects of changes to weightings, let alone actually applying them, are slim. Most will simply accept the defaults. The more you force it on them, and the more sophisticated the filtering you enable, the less usable (and used) the system will be. There is almost certainly at least a partial answer to this problem - it's a design problem that can in principle be solved by a sufficiently inventive technology - but I've yet to find it.
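To make the 'personalizable in the true sense' idea a little more concrete, here is a minimal, purely illustrative sketch in Python. Nothing here reflects any real platform's code; the post attributes, the weights, and their default values are all invented for the example. The point is only that the ranking weights are owned, visible, and editable by the reader rather than hidden inside a secret algorithm:

from dataclasses import dataclass

@dataclass
class Post:
    author: str
    agrees_with_me: bool  # crude stand-in for ideological similarity
    recency: float        # 0.0 (old) .. 1.0 (new)

@dataclass
class FeedWeights:
    # Defaults most people will never change - which is exactly the usability problem.
    recency: float = 0.5
    agreement: float = 0.3
    serendipity: float = 0.2  # deliberately boosts posts that disagree with you

    def score(self, post: Post) -> float:
        agreement = 1.0 if post.agrees_with_me else 0.0
        return (self.recency * post.recency
                + self.agreement * agreement
                + self.serendipity * (1.0 - agreement))

def rank_feed(posts, weights):
    # Order the feed by the reader's own, inspectable weights.
    return sorted(posts, key=weights.score, reverse=True)

# A reader who values diversity can dial serendipity up and agreement down:
feed = rank_feed(
    [Post("alice", True, 0.9), Post("bob", False, 0.8), Post("carol", True, 0.4)],
    FeedWeights(recency=0.3, agreement=0.1, serendipity=0.6),
)
print([p.author for p in feed])  # bob first - the dissenting view surfaces

Even in a toy like this, though, the problem described above remains: unless the design makes adjusting such weights effortless and their effects obvious, most people will simply live with the defaults.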
Another part of the solution - which speaks more to echo chambers than to filter bubbles - lies in culture and literacy. It is natural to be drawn to others who share interests or other commonalities with you, and for those who are not part of your sets to be treated as 'other', with all the bad things that entails. Escaping these traps means making an active decision to do so. If we, as individuals, put greater value on diversity - if, for instance, we actively seek out things that conflict with our views, that are not in our comfort zones, that are not among our self-chosen interests, that are randomly chosen or serendipitous - and try to understand them, then there is a much better chance that the world will become a better place. One of the big reasons that Canada is a better place to live than most others in the world is that, as a culture, we celebrate and embrace such diversity. The same can and should be true of our online lives.