I don't know if you have been following the Joe Rogan saga and the calls for Spotify to cancel his $100 million contract and deplatform him over his views and the views of some of his podcast guests, but it has been somewhat interesting to see how it has played out so far.
For those who don't know, YouTube took down certain episodes of Rogan's podcast after "public outcry" because his highly experienced guests were skeptical of various aspects of the Covid-19 vaccinations, their effectiveness and their need in general, especially for younger, healthy people. Music stars threatening to remove their own content from the platform unless Rogan is removed, along with "open letters" from doctors, have created changes at Spotify.
Firstly, they removed the music of those who gave the ultimatum. Personally, I like this move, as it shows that there is an "opt-out" option for musicians, at least those who actually own their music rather than a label owning it. Secondly, rather than "cancelling" Rogan, they will be rolling out a "content advisory" function that will warn listeners that Covid-related material is involved. On top of this, they will build a resource of up-to-date information from "reputable sources" on the matter.
However, from what I was reading, there were a few interesting quotes from the founder and CEO of Spotify, Daniel Ek, and from Joe Rogan himself.
“Pick almost any issue and you will find people and opinions on either side of it. Personally, there are plenty of individuals and views on Spotify that I disagree with strongly,”
This doesn't mean he has to deplatform them.
“In that role, it is important to me that we don’t take on the position of being content censor, while also making sure there are rules in place and consequences for those who violate them.”
They do have rules, but they are the rules of the platform, and it seems that they aren't going to let the public decide what they are, especially around something as contentious and unclear as the definition and handling of Covid-19 and the vaccinations.
Which leads into the quote from Rogan:
“The problem I have with the term misinformation, especially today, is that many of the things that we thought of as misinformation just a short while ago are now accepted as fact. Like for instance, eight months ago, if you said, ‘if you get vaccinated, you can still catch COVID and you can still spread COVID,’ you'd be removed from social media. They would ban you from certain platforms. Now that's accepted as fact. If you said, ‘I don't think cloth masks work,’ you would be banned from social media. Now that's openly and repeatedly stated on CNN. If you said, ‘I think it's possible that COVID-19 came from a lab,’ you'd be banned from many social media platforms. Now that's on the cover of Newsweek.”
Interestingly, the article I originally read this quote in was later edited to remove it.
But this is not about how you and I feel about vaccinations, or whether you like or dislike Rogan, Spotify or anyone else. It is about the processes surrounding information and policy making. As I see it, letting the "public decide" what is true or not based on their own fears is going to head down a very slippery slope, because the public as a mass is fed a centralized narrative that drives that fear, making it not entirely public opinion. This means that we are influenced to demand, but we aren't privy to the information that has been omitted.
Transparency of information is welcome, but it has to be transparency of all information, not selected information that frames a complex set of circumstances in a particular way to sway public opinion and thereby drive "demands" on platforms. Yes, consumer demand has to be considered of course, but the problem is what inspires that demand. Applied to some other situations, this logic would say it is fine in some places to stone women to death for adultery, because the local public demands it.
Seems a bit extreme, don't you think? But that is the problem with culture: it shapes and reinforces behaviors to work on an unexplored default, rather than actually considering why something is done, or if it should be done at all.
As Rogan said, people have been banned from platforms for spreading misinformation that is now considered fact. This is much like the Hungarian doctor Ignaz Semmelweis, who first made the link between hygiene and infection before germs were even known to exist. He asked his staff to wash their hands between performing autopsies and delivering babies, and he was ridiculed by his peers.
Semmelweis’s ideas were not accepted by all of his colleagues. Indeed, many were outraged at the suggestion that they were the cause of their patients’ miserable deaths. Consequently, Semmelweis met with enormous resistance and criticism. source
Culture at work.
Just the other day I was talking to a colleague who was extolling the virtues of trusting the media because they print it and are held accountable, yet they continually change their story. For example, there are pre-GFC articles from financial experts saying all is well, and then those same news services post-GFC saying how obvious it all was and that we should have seen it coming. Fact checking just doesn't happen anymore; it is all about opinion and getting clicks now.
I am sure that there will be a lot of "debunking the misinformation" content from six months ago that is now debunked itself. Yet, the way it is positioned in the media is going to influence public opinion, even if it is in direct conflict with experience.
Kiss and hug child in the morning. Compulsory to wear your mask to take your child to daycare to stop spread of Corona. Child doesn't wear mask and swaps clothes and food with friends. Wear mask to pick child up. Kiss and hug child goodnight.
Makes sense.
But seemingly unquestioningly, people do it because they have been told to. Yet they must have questions, right? So that means they do it because of group conformity, or fear of social repercussions from their peer group and strangers alike.
I don't know where all of this will go, but I do think that if it continues on the current path, we are not going to like where it ends up. The more that public opinion is dictated by fewer people, and the more that misguided public opinion sways policy, the worse it is going to get.
Perhaps at some point we will have all relevant information transparent and have AIs trawl the data, building trends out of billions of data points, cross-referencing them all and tempering them with millions of other pieces of information to build a web-of-trust network where, based on that very wide and independent evaluation, we can be reasonably confident in the results. Of course, this can't happen at the moment because we don't even control our own information.
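To make that idea slightly more concrete, here is a toy sketch (in Python, with entirely made-up evaluators, sources and trust scores) of what a web-of-trust style evaluation could look like in principle: a claim earns confidence from the combined trust of independent sources that corroborate it, rather than from how loudly or how often it is repeated.

```python
# Toy illustration only: a web-of-trust style aggregation where a claim's
# confidence comes from the combined trust of independent corroborating
# sources, not from repetition volume. All names and numbers are invented.

# Hypothetical trust scores (0..1) that each evaluator assigns to each source.
trust_edges = {
    ("evaluator_a", "source_1"): 0.9,
    ("evaluator_a", "source_2"): 0.4,
    ("evaluator_b", "source_1"): 0.7,
    ("evaluator_b", "source_3"): 0.8,
}

# Which sources assert which claims (again, purely illustrative).
claims = {
    "claim_x": ["source_1", "source_3"],
    "claim_y": ["source_2"],
}

def source_trust(source: str) -> float:
    """Average the trust that independent evaluators place in a source."""
    scores = [t for (_, s), t in trust_edges.items() if s == source]
    return sum(scores) / len(scores) if scores else 0.0

def claim_confidence(claim: str) -> float:
    """Combine corroborating sources: 1 - product of (1 - trust) per source."""
    remaining_doubt = 1.0
    for source in claims.get(claim, []):
        remaining_doubt *= 1.0 - source_trust(source)
    return 1.0 - remaining_doubt

for claim in claims:
    print(claim, round(claim_confidence(claim), 3))
```

The catch, of course, is the independence assumption baked into that combination step; in the real world, sources repeating the same centralized narrative are anything but independent, which is exactly the problem described above.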
There is a quote attributed to Eleanor Roosevelt:
Great minds discuss ideas; average minds discuss events; small minds discuss people.
What I think has happened online is that we think we are discussing ideas and even events, but what is actually happening is that we are largely discussing people, our opinions of them and, most importantly, ourselves. The challenge is that we are looking at complex issues and making decisions based on how we feel about them, and we have been driven to feel fear around topics we have little experience or visibility on, and then told who we need to listen to in order to be saved. And not surprisingly, there is always an agenda of some kind that is incentivized to omit information to serve its purpose.
When people talk about Joe Rogan spreading misinformation, what they are failing to do is improve information systems so that misinformation isn't possible. But that is a greater problem to solve, and our minds are, on average, quite small. It is also not in the best interest of those who control information to solve it.
Taraz
[ Gen1: Hive ]