Social media algorithms are quietly molding our thought processes, often knowing us more intimately than we know ourselves. Jack raises an issue that goes beyond the typical debates around free speech, touching on the erosion of our very free will. He points out how algorithms, which respond to our digital interactions, selectively curate our content feeds, introducing an inherent bias. He is sounding the alarm: our growing dependency on these algorithms is diminishing our capacity for independent thought and decision-making. Our opinions and perspectives are increasingly shaped subconsciously by algorithms that think they know us, rather than being the product of our own deliberate thought. In an age where algorithms predetermine our preferences, we must not lose sight of our own agency. Questioning our own beliefs and assumptions is more important now than ever.
Mark Korthuis’ Post
More Relevant Posts
-
Pushing the boundary of social media: Shubham Upadhyay is asking important questions! So ask yourself, how often do you encounter a viewpoint that truly challenges your thinking? If your answer is 'not often,' then you're likely stuck in an algorithmic echo chamber, a place where opposing views are drowned out by the roar of information that simply reinforces what you already believe. This not only narrows your worldview but also contributes to algorithmic information fatigue and cognitive overload. In this era of information abundance, it's time we demand something better. It's time to break free from the echo chambers and navigate the digital seas with clear vision and purpose. It's time for a new approach to how we engage with the digital world.
-
The Reality of Shadow Banning and Censorship: It's Not About Silencing the Wrong Voices—It's About Making You Feel Invisible.

Social media was supposed to be the great equalizer—a place where ideas could be shared freely, where everyone had a voice. Yet, for many of us, it has become a platform that’s designed to pull us down, make us feel like nobodies, and create the illusion that what we have to say doesn’t matter. Shadow banning and censorship are more than just ways to silence dissent—they’re tactics to make people feel like they don’t even exist, that no one cares about their thoughts or contributions. But let me tell you, that feeling is completely untrue.

Yes, social media algorithms seem to favor narcissistic, self-promoting content over the voices of those who speak with empathy, depth, and truth. Yes, your posts might be buried at the bottom of the feed, while others—who pay for premium or stick to the mainstream—are given a spotlight. And yes, this system can make you feel irrelevant, like your voice is drowned out in the noise.

But that doesn’t mean you should stop speaking up. In fact, it’s more important than ever to continue sharing what matters. The fact that these platforms try so hard to push certain voices down should tell you something: what you’re saying has value, and that’s exactly why they’re trying to suppress it. Social media platforms may prioritize those who can pay for visibility, but at the end of the day, truth has a way of reaching people—regardless of algorithmic manipulation. Being shadow banned or censored doesn’t mean your voice isn’t being heard by those who need to hear it.

So don’t let these tactics get into your head. You do exist. Your voice does matter. And while the system may try to push you to the margins, your message has the power to resonate far beyond the limits they try to impose. Keep speaking up, because the world needs more authenticity and less of the surface-level noise. https://lnkd.in/gvJTj5cR
-
Check the links below if you want to read the papers I wrote on search. While they are mainly about search, the behavioral aspects also apply to all modern social media companies. Professor Milutinovis and his students visited me and asked if I had any good ideas. I told them about my ideas on behavioral search; his students did the research, and he wrote it up. A note to Googlers: there are many ideas here that have not been implemented and are worth looking into. If you know anyone in the field, please let them know. https://lnkd.in/gHQaZPRE https://lnkd.in/gFRFwYj9
-
Social media contributes to the spread of misinformation, echo chambers, and tunnel vision. Here's why:
1. Confirmation bias: Social media algorithms often prioritize content that aligns with our existing beliefs, creating an "echo chamber" effect.
2. Selective exposure: We tend to follow and engage with sources that confirm our views, ignoring contradictory perspectives.
3. Information overload: The sheer volume of information on social media can lead to mental shortcuts, causing us to accept information without critical evaluation.
4. Emotion-driven sharing: Emotive content is more likely to be shared, even if it's misleading or inaccurate.
5. Lack of nuance: Complex issues are often oversimplified or reduced to binary choices, neglecting subtleties and context.
6. Authority and credibility: Social media can amplify unqualified or biased sources, giving them undue credibility.
7. Group polarization: Social media can foster groupthink, where individuals conform to dominant views within their online communities.
These factors can indeed lead to tunnel vision, where a concocted view is presented as gospel truth and alternative perspectives are ignored or dismissed. By being aware of these dynamics and taking steps to address them, we can work towards a more informed and inclusive online discourse.
-
A very insightful and well-researched analysis of the current state of Big Tech, in particular social media. It's a long but very entertaining read. https://lnkd.in/gAs8fdmq
‘Enshittification’ is coming for absolutely everything
ft.com
-
In the vast digital landscape where ideas and influence converge, it is essential to remain vigilant about the ethical implications of our online behaviors. As professionals, our interactions on social media can subtly shape our moral compasses, impacting real-world decision-making. Let's champion a culture of integrity and responsibility in every click and comment, ensuring that our digital footprint aligns with our ethical standards. This image visually represents the powerful influence of digital media on our psyche and moral decisions. The juxtaposition of a human brain with a reaching hand and a smartphone symbolizes the grip that social media can have on our thoughts and actions. It serves as a reminder that while technology connects us, it also holds the potential to lead us astray if we're not cautious about the content we consume and the interactions we engage in. This artwork was created to provoke thought on the impact of our online behavior and to encourage a mindful approach to digital consumption. #DigitalEthics #ProfessionalIntegrity
-
The Digital Web of Deception: How Social Media Fuels Conspiracy Theories How social media algorithms, psychological factors, and misinformation are fueling the spread of conspiracies. https://lnkd.in/eTcn6iC2
The Digital Web of Deception: How Social Media Fuels Conspiracy Theories
frankarriola.substack.com
-
"Packetized media has rapidly become the dominant form of media, although it’s important to note that dominance doesn’t mean replacement since the earlier forms of media don’t disappear; instead, as we’ve seen, these older types will be integrated into the new dominant form. Here’s how this shift happened. --Technological change makes a shift possible (social networking, starting in 2001, was the start of true packetization, and the mass adoption of smartphones made it available to everyone). --A dopaminergic effect drives early adoption -- if you want to fall asleep at night, read a book; if you want to stay awake, scroll X or TikTok. This dopaminergic effect fades as our brains build up a tolerance to it. --Repeated use of a new dominant media forcibly alters how we think and perceive reality—it actively rewires the neurons in our brains (as we saw with the mass adoption of reading during the Reformation). This new method of thinking will inevitably force social reorganization. In most previous cases, this social reorganization is modest (i.e., broadcast media) since the new dominant media is merely a variant of previous dominant forms." https://lnkd.in/gpcFddav
Packetized Media
johnrobb.substack.com
-
Honestly, it feels like social media is tearing our democracies apart. Instead of bringing people together for real conversations, it pushes us into echo chambers where we only see what we already agree with. I mean, it’s weird how algorithms categorize us, predicting our “most probable states”, as Václav Havel put it, and then feed us content that makes us even more extreme. It’s not just that conspiracy theories are spreading, it’s that we’re being primed to believe them. With every post, every click, it’s like we’re training our brains to eagerly await the next shocking and outrageous statement. And when some turned this into a Wild West of “free speech,” it only got worse: misinformation and toxic content are almost everywhere. The scary thing is that it makes people more divided, more distrustful, and more likely to buy into crazy theories. Democracies are supposed to thrive on debate and understanding, but most social media platforms seem to be doing the opposite. Are we really okay with letting these platforms that are supposed to connect us tear society apart?
-
Digital literacy will be vital in the near future if we are to change the equation on the side effects of social media use!
The most interesting thing in tech: new research from Jonathan Haidt and Will Johnson on the social media use of Gen Z. They find that young people use social media all the time; they have to keep up with their friends. But a huge portion of them wish that it had never been invented. They know it's deeply harmful, but none of them wants to be the only one not using it. This is a collective action problem—and profoundly disheartening.
Jack highlights a critical issue: the subtle erosion of our free will by social media algorithms that mold our thoughts and decisions. These algorithms curate content based on our digital interactions, introducing biases that shape our perspectives subconsciously. In this context, we absolutely have a choice! Most of us choose not to actively build and control our personal brands, yet doing so is vital. By deliberately sharing our experiences and knowledge, we use the power of our voice to assert our agency, ensuring our messages reflect our true selves rather than algorithmic predictions. I believe NOW more than ever, we must question our beliefs and assumptions to maintain our capacity for independent thought and decision-making. What do you think, Mark?