
Rubbing with ambient news and giving up the shovel


I'm someone who values facts and research. I like to dig for information. I read tons of news and books, keep an RSS feed, and will spend an hour digging through scientific journals; I could spend a day at a museum, library, or bookstore. I value details, the unknown, the blast of truth you can find — a sparkling chain of wonder that erupts from those digs.

Because the truth — the spiked and delectable details — has always felt powerful to me. Like owning a form of power; it has given me a stronger context, firmer ground to stand on. But over the last five years, maybe ten, it's easy to see a pattern in which those truthful details matter less. Facts don't change minds or sway moods. My relationship to information, and the value I place on it, is falling out of style — if it was ever in style.

In the New Yorker, Nathan Heller delivers a thoughtful thesis on how technology has changed Americans' relationship to information, and why that change allowed many Americans to believe the rhetoric of President-elect Trump: "Pablo Boczkowski has noted that people increasingly take in news by incidental encounter—they are “rubbed by the news”—rather than by seeking it out. Trump has maximized his influence over networks that people rub against, and has filled them with information that, true or not, seems all of a coherent piece."

To rub against the news — to huff in what Heller calls "ambient information" — means you believe a reality at the surface. You see what's in your scroll, in your ear — sound-bitten, chunked, posted, snipped — and you assume the oxygen is fine. There is more assuming than investigating altogether.

To build context means to give a subject attention. What Jenny Odell might call a strengthening of a relationship to place and time.

To my dismay, this isn't the default mode for how many people find information. Information is a living thing, a constant conversation of contributors. Beyond the people in a network, it may be the most important relationship someone has. It creates a worldview, a rhetoric. Rhetoric is fundamental to being human. The more diversely and thoughtfully — and perhaps more slowly and attentively — you engage with information, the more robust your rhetoric can become.

But alas, as Heller puts it, with ambient information leaking and broadcasting from pockets, TVs, computers, tablets, and pads across the nation, surface-level information remains the more powerful force in shaping understanding.

"Of all the data visualizations that were churned out in the hours following the election, the one that struck me most was a map of the United States, showing whether individual areas had voted to the left or to the right of their positions in the Presidential race in 2020. It looks like a wind map. And it challenges the idea that Trump’s victory in this cycle was broadly issues- or community-based. The red wind extends across farmland and cities, young areas to old, rich areas to poor. It is not the map of communities having their local concerns addressed or not. It’s the map of an entire nation swept by the same ambient premises.
In a country where more than half of adults have literacy below a sixth-grade level, ambient information, however thin and wrong, is more powerful than actual facts. It has been the Democrats’ long-held premise that access to the truth will set the public free. They have corrected misinformation and sought to drop data to individual doors. This year’s contest shows that this premise is wrong. A majority of the American public doesn’t believe information that goes against what it thinks it knows—and a lot of what it thinks it knows originates in the brain of Donald Trump. He has polluted the well of received wisdom and what passes for common sense in America. And, until Democrats, too, figure out how to message ambiently, they’ll find themselves fighting not just a candidate but what the public holds to be self-evident truths."

Even if our technology has forced our political parties into a strategy of ambient information, I will still continue to dig. Even with the rise of chatbots as shuttlers and keepers of information, I will still dig.

For chatbots eliminate the need to dig, just as ambient information does. They take away the skill of noticing and digging, of building any kind of context. It feels like the opposite of free thinking, because the thinking — the thoughtful art of noticing information — is outsourced.

For ambient information is like looking at the tops of buildings without understanding what's within them, or down on the ground, or what built them. It's information scaled away from its roots, possibly far from its sources. Rob Horning offers:

To characterize attention as a ‘vector’ is, for Valéry, to insist that it is by nature pressure, prolongation, effort, conatus – or, to be even more precise, ‘direction of effort’ ... If attention selects, filters or prioritizes, it does so starting from a principle of orientation. Attention cannot be reduced to a simple given, a static number: it is much less (countable) reality than (unpredictable) ‘potential’. In other words, as it relates to thought, attention ‘is always formed in vectoral mode’ (like an arrow), and it is only when it stops to think and develop that it can be grasped ‘in scalar form’ (like a number). 
This scalarization – which is to say, the operation that translates arrows into numbers – denies the fundamental nature of attention, in the same way that putting a bird in a cage denies its nature as a flying creature. But, as we have seen, it is precisely to a ubiquitous scalarization that we are condemned by the financial logic of capitalism.  

Using attention means being uncaged — it means you have a bit more control over how you understand your context. AI might become something of a thought partner in this, but it will still have limits:

Generative models are ignorant enough, but they only explain contents (with no guarantee of being correct) and convey nothing about the habit and pleasure of noticing, or the process of attentiveness that is instrumental to learning. Nothing about their process of generating outputs has anything to do with how humans generate outputs, so they can’t really teach anything, any more than search engine results teach you how to be a researcher. The models can put the content they are tasked with generating into inanely cheerful voices, as with Google’s Notebook LM podcast generator, but this just makes matters worse — it expects us to take pleasure in the simulation of banter rather than in the kindling of the desire to know.
“The best way of showing another how to research is still to research together,” Citton argues. Generative models may get better at simulating that kind of togetherness with chatty interfaces, but they can’t show us how to notice anything.