Maybe that was a bit of a ‘clickbait’ title, since the list of problems with Facebook is epic, tragic, and depressing. So let’s go with, “tonight’s example of an ongoing problem with Facebook”.
One of my biggest gripes about the social media platform is that after all this time, they still do not give us a simple way to view posts chronologically. At some point in the past, they introduced an option that supposedly does that, but it was done via a URL argument rather than a user-friendly GUI widget. I’ve used that option to view Facebook to this day, and it is still horrible. Why? Because just as you think you have finally found the holy grail of simplicity, the feed is still weighted… just less so. Meaning you are even more annoyed when some crappy post pops up four times that day.
OK, so they want the weighting and control to deliver the posts your friends make as they see fit. That means you never see some posts you absolutely want to see, while seeing other posts multiple times a day. Their algorithm has nothing to do with standard weighting, and everything to do with a weird formula that no one can seem to figure out. OK, fine…
Facebook has also been on a tear about ‘honesty’ in the form of user profiles. The last few years have seen nothing but drama and turmoil as Facebook tries to enforce their ‘real name’ policy. A policy that the Chief Product Officer at Facebook apologized for, that ensnared a former employee or seven, that unfairly targets the LGBT community, and that has caused enough headache to warrant a Wikipedia entry. Oh, and of course, one that the “noble and charitable” Mark Zuckerberg defends. So… integrity and honesty and clarity are important, right?
That sets up the easiest of questions: why is Facebook targeting their user base, whom they profit from regardless of whether a real name is attached? Sure, they may make a few more pennies on the dollar with a real name attached rather than a pseudonym, but either way is profitable. For years, it let them defend their absurdly high user count, on top of the obvious ploys like counting idle accounts and such. Now, jump to tonight, which offered a perfect example of where Facebook shows they don’t care. A rather simple example, but one that should be trivial for them to programmatically notice and warn against, in a variety of ways. If a single user is posting something that may be fraudulent, contradictory, or a basic scam (e.g. how many times have you been tagged in an image for Oakley sunglasses, even in 2016?), why isn’t there a warning? Even when the account isn’t compromised, the user isn’t warned. When the same image of knock-off sunglasses is posted to hundreds of ‘friends’ from a compromised account, it comes with no warning either, whether from the subject matter or from the break in normal behavior (e.g. a user with 87 friends tags one photo with all 87 names, when they have never tagged more than 2 people in the last 5 years). We’re not talking AlphaGo or Microsoft Tay; we’re talking a couple of decades behind them as far as computer intelligence goes. The fact that one was an amazing success while the other was an amazing failure speaks to my point: they are cutting edge, trying to solve ‘problems’ that are incredibly complicated. Meanwhile, Facebook can’t figure out what boils down to mid-1990s email spam patterns, or implement the most basic of statistical filtering.
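To illustrate just how basic the behavioral check above would be, here is a minimal sketch in Python. Everything in it is hypothetical (the `tagging_anomaly` helper, the made-up tag-count history); it is only meant to show the kind of baseline-deviation test that flags a user suddenly tagging 87 people when they have never tagged more than 2:

```python
from statistics import mean, pstdev

def tagging_anomaly(history, new_tag_count, min_sigma=3.0):
    """Flag a post whose tag count wildly exceeds the user's baseline.

    history: tag counts from the user's past posts, e.g. [0, 1, 2, 0].
    new_tag_count: number of people tagged in the new post.
    Returns True when the new count is a statistical outlier.
    """
    if len(history) < 2:
        return False  # not enough history to judge
    mu = mean(history)
    sigma = pstdev(history)
    # Guard against a zero-variance history (user always tags the same count)
    # by enforcing a minimum spread of 1 tag.
    threshold = mu + min_sigma * max(sigma, 1.0)
    return new_tag_count > threshold

# A user who has never tagged more than 2 people suddenly tags 87:
print(tagging_anomaly([0, 1, 2, 0, 1, 2, 0], 87))  # flagged
print(tagging_anomaly([0, 1, 2, 0, 1, 2, 0], 2))   # normal behavior
```

That is the whole idea: a mean, a standard deviation, and a threshold. It would not catch everything, but it is exactly the sort of cheap statistical filter that has existed since the email-spam era.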
That said, I would love to see Facebook answer how the following two posts, from the same user, within 40 minutes of each other, could be posted without a warning to them AND me. Compare the posts carefully, not that there is much to go on as far as the end-user sees. At some level, this is stupidly trivial, and any half-assed program should notice it. And no, it is not trivial in consequence, or worth ignoring, that such articles get posted with such a discrepancy. That is how we end up with stupid rumors and lies spread around as if they were fact, and it is fundamentally why our political climate is the way it is. When you ignore the details, especially the obvious contradictions, you are buying into a system that doesn’t serve you; rather, one that only exploits you.
One response to “The Problem with Facebook…”
There are many people in this world who equate good and evil, and this has all sorts of long-term consequences. The Facebook thing, like the political thing, both are coupled into this feedback loop. And so are you and I.