Reading Jessica E. Lessin’s essay “What Critics of the News Industry Get Right — and Wrong” prompted me to return to the subject of modern journalism and the media companies it’s intrinsically entangled with.

Democratized technology benefits society: it enables citizen reporting, speeds the dissemination and syndication of stories, encourages a community of discussion and commentary, and, essentially, gives a voice to the oftentimes voiceless in broader media contexts. It has been discouraging for the practice of journalism, however, as old-world news media companies have taken a backseat in what is generally recognized as news. They now seemingly have to compete at the same distribution level as any layperson or potentially “untrusted” institution on Facebook, Twitter, TikTok, or YouTube. In a fate similar to what middle-platforms like DoorDash and UberEats have wrought upon restaurants, the direct connection to readers and their attention is now shared and competed for on the sites, platforms, and apps people default to (as opposed to visiting nytimes.com or a restaurant’s site directly).

This permeates everywhere: what was once curated radio now competes on the same technological battlefield as sophisticated video and audio podcasts, distributed by citizens as well as up-and-coming media empires. That is generally a good and progressive thing, but all of media is now predicated on limited-attention behavior and on exploiting headlines and interfaces to propel engagement. And that is where we are seeing the fabric of journalism unravel, alongside a bull rush of misinformation.

Lessin suggests that “services should create tools and establish norms for disclosing conflicts of interest and encouraging questions from attendees.” That is a fine approach, arguably a necessity. Twitter and YouTube have done uneven jobs of this lately, but they are on the right path toward including it in their platforms and making sense of terms-of-service expectations: not singling out any specific person or behavior, but creating parameters within which the platforms must be used as intended.

We inevitably get into misinformation territory when talking about this subject, and that is where I have concerns (not necessarily opposing opinions) about what Lessin states next.

The general idea is that platforms should allow people to get context on what they are hearing from public figures on new platforms, and that this should be baked into the culture of the products, not included as an afterthought.

This isn’t wishful thinking, but it is a byzantine implementation challenge, because it will now have to be an afterthought for primary services and media platforms. And it isn’t just about providing context for public figures and what they say; it needs to be applied equally to journalists, opinion columnists, and citizen reporters. In this sense, I don’t see it happening in a scalable way any time soon.

Outrage news, predominant headline narratives, and surgically edited sound and video bites will continue to dominate platforms and media sites because the behavior of readers and watchers is, again, predicated on limited attention and infinite-scroll exploitation. We still live in a quick-bites society (post character limits, meme storytelling, sub-one-minute video and audio clips), oftentimes out of context and without a pragmatic, scalable way to clarify any of it. Linking to more information (using a sound or video bite as the attention grab and pointing to a fuller essay or report) is one way of handling this, but that is like mixing memes with textbooks: you came for the memes, you don’t want to read a book explaining them.

Lessin ends her essay with a note that we should be concerned with how “[these services] will be manipulated by a few very powerful people in the future.” We have already seen this take place with the mass dissemination of misinformation in the political universe, but we also see it play out all the time in niche sectors.

Right now we are seeing Facebook sell a narrative that Apple is acting in an anti-small-business way with an upcoming iOS update that simply asks users, upon opening an app, whether they want to allow that app to track them. The update is a threat to Facebook’s advertising livelihood and to how it conducts its business, but the predominant headline narrative targets the gatekeeper in charge of distributing the Facebook app. Apple, for its part, has crafted a narrative about being the do-gooder of the tech space, advocating for privacy controls that do not impact its bottom line. Apple (and many privacy-forward internet browsers and content blockers) believes an end user has the right to choose when and how they are tracked. Without studying the intricacies of the situation, which requires more than a quick bite of information, a reader may not understand the full context, may jump to assumptions based on predominant headline narratives, and move on.

So if we are arguing that media service companies need to label and fact-check, shouldn’t we apply the same principle to every scenario imaginable that could manipulate or persuade an end user? Is that possible? Is it the right thing to do? We are on a slippery slope in attempting to regulate major tech services and platforms: on one hand it makes sense as a way to help clarify concerning headline or misinformation trajectories, but on the other it endangers the very democratic principles that allow us to communicate freely and openly with one another, including the evolution of citizen reporting in places traditional media and journalist outposts have been unable to cover.

A smart servicing layer atop these platforms, with accredited or blessed old- and new-world media institutions as the resource backend, could help identify topics and scale additional reference material to complement any kind of post, sound bite, video report, and so on. It would require coordination among tech platforms and an agreement to build an approved ecosystem that allows them all to scale it together in a consistent manner.

Without coordination and general guidance among these service platforms, I don’t see the challenge at hand being remedied in a competent way any time soon.