
New Dark Age: Technology and the End of the Future (2018)

by James Bridle

As the world around us increases in technological complexity, our understanding of it diminishes. Underlying this trend is a single idea: the belief that our existence is understandable through computation, and more data is enough to help us build a better world. In reality, we are lost in a sea of information, increasingly divided by fundamentalism, simplistic narratives, conspiracy theories, and post-factual politics. Meanwhile, those in power use our lack of understanding to further their own interests. Despite the apparent accessibility of information, we're living in a new Dark Age. From rogue financial systems to shopping algorithms, from artificial intelligence to state secrecy, we no longer understand how our world is governed or presented to us. The media is filled with unverifiable speculation, much of it generated by anonymous software, while companies dominate their employees through surveillance and the threat of automation. In his brilliant new work, leading artist and writer James Bridle surveys the history of art, technology, and information systems, and reveals the dark clouds that gather over our dreams of the digital sublime.

There are currently no conversations about this book.

A 2018 book about the impact of technology on our lives risks feeling dated when read only four years after publication (as I did). That is not the case here, apart from the consequences of the pandemic, which have certainly sharpened some of the phenomena described. This is a "serious" book in which, I have to say, the imprint of a European intellectual makes it very different from what it would have been had an American written it. It is a dense work, one is tempted to say "complete", that describes the era we live in as a "dark age" (with a reference to Virginia Woolf), an era within which we must learn to dwell by navigating complexity and avoiding the reductionist drift that this new "world of algorithms" unfortunately encourages. Plenty of useful prompts for keeping our guard and our attention up.
  d.v. | May 16, 2023 |
This book is very descriptive, with Bridle offering a survey of what he considers to be eminent problems, and where they originated. Descriptive because there are points where Bridle attempts to be prescriptive, but it never really goes beyond a plea to "think more." He wants us to do a lot of thinking. I don't think this is wrong, but asking humans to think and act collectively is usually a bold strategy. And while I wasn't expecting a chapter of highly prescriptive, economically informed advice for how exactly to navigate these crises, ending it with "we just need to think" is very weak, and seems almost like a punchline given everything the book just explored in detail.

Technology, especially powerful and expansive technology, is usually assimilated into government agencies and programs before it sees a consumer market. Or technology develops in the confines of somewhere like Silicon Valley, where Google campuses are about as confidential as government agencies. While it'd be nice if, every time a novel or innovative technology emerged on the market (GPS, image recognition, deepfakes, etc.), the public were allowed to chew on its societal & political consequences before giving it the green light, that just doesn't happen. The most influential technologies are developed, released and integrated without any consumer consent. Data warehouses are built all the time; I have no idea where or when. Fiber-optic cables are laid all the time; I have no idea where or when. I have next to no influence on this infrastructure, just as I have next to no influence on whether apps track my data (which could be my face, my voice, my location, my habits, etc.).

Bridle's main argument is that the integration of these various technologies has yielded a unique form of thinking: "computational thinking." It's the cousin of solutionism; however, where solutionism believes there is a (frequently market-based) solution to every problem, computational thinking believes every problem is computable. If the problem is apparent but the solution unknown, computational thinking presumes that gathering immense data on the problem will naturally reveal its solution. There is one reassuring portion of this constant data harvest: while the NSA does track almost every predictable datum of my life, there's almost nothing substantial to be done with it. Like Synecdoche, New York, where the playwright seeks to reflect every minute aspect of life & struggle, the performance becomes so maximalist as to be impossible to watch. There's so much concurrent information, one would need to be a god to truly witness and make use of it. So while the NSA collects inconceivably large amounts of data, there is no human interpreting and acting on it. Most of our data are likely never even seen, except by a machine. However, this is where it becomes scary again. If ever we decide to leverage a machine to interpret these data, we allow the human faculties for ambiguity, uncertainty and nuance to be shoved aside in favor of "efficient" and "unbiased" computational justice.

This is one solution Bridle offers that has more material basis in reality. He uses the Google Optometrist as a paragon for how we can use computers to process prodigious data while retaining our role as the arbiter of its application. Essentially, there is an AI known as the Optometrist because it processes multiple reactions and combinations of data, then presents the outcomes to a human viewer, who can decide on their validity or utility. Like an eye doctor: "Which looks better? One, or two?" The doctor does not tyrannically decide for you and send you scurrying out the door with a bad prescription. While the consumer may never have an opportunity to play arbiter, those working in technology can, and this is partially what Bridle means when he claims we must think more.
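To make that suggest-and-choose pattern concrete, here is a minimal Python sketch of the loop described above. It is only an illustration of the idea, not Google's actual Optometrist Algorithm; the starting settings, function names and spread value are invented for the example.

```python
# A minimal sketch of the "machine suggests, human chooses" pattern
# (my illustration; NOT Google's actual Optometrist Algorithm).
import random

def propose(current, n=2, spread=0.1):
    """Machine side: generate n candidate settings near the current best."""
    return [[v + random.gauss(0, spread) for v in current] for _ in range(n)]

def ask_human(candidates):
    """Human side: show the options and let the person pick, like the
    eye doctor's 'Which looks better? One, or two?'"""
    for i, c in enumerate(candidates, start=1):
        print(f"Option {i}: " + ", ".join(f"{v:.2f}" for v in c))
    choice = int(input("Which looks better? "))
    return candidates[choice - 1]

best = [0.5, 0.5, 0.5]          # hypothetical starting experiment settings
for _ in range(3):              # a few rounds of machine-suggest, human-decide
    best = ask_human(propose(best))
print("Settings chosen by the human arbiter:", best)
```

The point of the design is simply that the computer does the brute-force exploration while a person remains the final judge of which outcome is acceptable.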

Without cataclysmic collapse, we are well beyond the point of Luddite solutions, where we joyously throw our phones into a fire and live off the land in utopian communes. Technology is so intrinsic to our most quotidian functions, its collapse would entail disaster. Banks would cease to function, GPS units would fail, contact in the event of these failures would prove incredibly difficult. This forces us into a position where we must learn to think with machines, as opposed to letting machines think and do for us.

It's very difficult, however, to imagine a scenario where novel technology isn't swiped by hedge funds and governments to pry economic and societal schisms further and further apart, intentionally or not. How much liberty do lay folks have over technology that is beyond their comprehension and hidden from their scrutiny?
  MilksopQuidnunc | Sep 26, 2021 |
The more the complexity of the technological world increases, the more our understanding of reality diminishes: the information we receive every day is riddled with unverified data, post-truth, conspiracy theories...

All of this increasingly turns us into castaways lost in a sea of speculation. James Bridle, the high-profile technologist and author of these pages, warns us of a future in which the contemporary promise of knowledge delivered by technology may bring us exactly the opposite: an age of uncertainty, predictive algorithms and meticulous surveillance systems.

A masterful and terrifying book that takes us into the unsettling storm looming over the debate about the wonders of the digital world.
  bibliotecayamaguchi | Jun 9, 2020 |
sweating sad guy emoji
  uncleflannery | May 16, 2020 |
I wanted to find out what Bridle had to say because I've been calling the right-wing, draconian-control, backwards trends in the US the "New Dark Ages" for years now. This took a bit to get into... the read is easy, but Bridle is inconsistent, prone to exaggeration, and repetitive. Still, what he has to say is scary. Bridle opens with:
‘If only technology could invent some way of getting in touch with you in an emergency,’ said my computer, repeatedly.
Following the 2016 US election result, along with several other people I know and perhaps prompted by the hive mind of social media, I started re-watching The West Wing: an exercise in hopeless nostalgia. It didn’t help, but I got into the habit, when alone, of watching an episode or two in the evenings, after work, or on planes. After reading the latest apocalyptic research papers on climate change, total surveillance, and the uncertainties of the global political situation, a little neoliberal chamber play from the noughties wasn’t the worst thing to sink into.
And we end with a message that technology is bad; no wait, it's good; no... bad; so bad as to be really bad. And it is. But we can't avoid it. Nor can we control it. The genie is out of the bottle, Pandora's box has let loose its demons, and maybe Bridle isn't exaggerating.

Bridle's chapter titles alliterate with the letter "C": Chasm, Computation, Climate, Calculation, Complexity, Cognition, Complicity, Conspiracy, Concurrency, Cloud. "Cloud" plays an early part because it sounds innocuous: clouds are ephemeral, insubstantial. But the cloud is anything but. It's "in the cloud". Safe, right?
The cloud is a new kind of industry, and a hungry one. The cloud doesn’t just have a shadow; it has a footprint. Absorbed into the cloud are many of the previously weighty edifices of the civic sphere: the places where we shop, bank, socialise, borrow books, and vote. Thus obscured, they are rendered less visible and less amenable to critique, investigation, preservation and regulation.
That is the Chasm.
And so we find ourselves today connected to vast repositories of knowledge, and yet we have not learned to think. In fact, the opposite is true: that which was intended to enlighten the world in practice darkens it. The abundance of information and the plurality of worldviews now accessible to us through the internet are not producing a coherent consensus reality, but one riven by fundamentalist insistence on simplistic narratives, conspiracy theories, and post-factual politics.
Bridle observes: "Automation bias ensures that we value automated information more highly than our own experiences, even when it conflicts with other observations – particularly when those observations are ambiguous." We are reliant on technology because it has to be better than humans, right? Lewis Fry Richardson wrote, "Einstein has somewhere remarked that he was guided towards his discoveries by the notion that the important laws of physics were really simple. R.H. Fowler has been heard to remark that, of two formulae, the more elegant is the more likely to be true. Dirac sought an explanation alternative to that of spin in the electron because he felt that Nature could not have arranged it in so complicated a way." Richardson's studies of the ‘coastline paradox’ (the correlation between the probability of two nations going to war and the length of their shared border was problematic because the length of the border depended on the tools used to measure it) came to be known as the Richardson effect, and formed the basis for Benoît Mandelbrot's work on fractals. "It demonstrates, with radical clarity, the counterintuitive premise of the new dark age: the more obsessively we attempt to compute the world, the more unknowably complex it appears." But that paradox isn't diminished with more data... rather, worsened.
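As a rough illustration of that coastline paradox (this sketch is mine, not from Bridle's book), here is a small Python example using a Koch curve as a stand-in coastline: each refinement is like measuring with a ruler one third as long, and the measured length keeps growing rather than converging.

```python
# A minimal sketch of Richardson's coastline paradox on a Koch curve:
# finer measurement resolution yields a longer measured "coastline".
import math

def refine(points):
    """Replace every straight segment with the four shorter Koch segments."""
    out = []
    for (x1, y1), (x2, y2) in zip(points, points[1:]):
        dx, dy = (x2 - x1) / 3, (y2 - y1) / 3
        a = (x1 + dx, y1 + dy)                      # one third along
        b = (x1 + 2 * dx, y1 + 2 * dy)              # two thirds along
        peak = (a[0] + dx / 2 - dy * math.sqrt(3) / 2,
                a[1] + dy / 2 + dx * math.sqrt(3) / 2)   # tip of the bump
        out += [(x1, y1), a, peak, b]
    out.append(points[-1])
    return out

def length(points):
    """Sum of segment lengths at the current measurement resolution."""
    return sum(math.dist(p, q) for p, q in zip(points, points[1:]))

coast = [(0.0, 0.0), (1.0, 0.0)]                    # a 1-unit "coastline"
for depth in range(6):
    print(f"ruler = 1/3^{depth}:  measured length = {length(coast):.3f}")
    coast = refine(coast)
```

Each round multiplies the measured length by 4/3, so the closer you look, the longer the coast gets; more data does not resolve the ambiguity, it deepens it.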
But, Bridle says
In a 2008 article in Wired magazine entitled ‘End of Theory’, Chris Anderson argued that the vast amounts of data now available to researchers made the traditional scientific process obsolete. No longer would they need to build models of the world and test them against sampled data. Instead, the complexities of huge and totalising data sets would be processed by immense computing clusters to produce truth itself: ‘With enough data, the numbers speak for themselves.’
And then he seems to contradict himself: "This is the magic of big data. You don’t really need to know or understand anything about what you’re studying; you can simply place all of your faith in the emergent truth of digital information." Uh, technology good?
He observes
Since the 1950s, economists have believed that in advanced economies, economic growth reduces the income disparity between rich and poor. Known as the Kuznets curve, after its Nobel Prize–winning inventor, this doctrine claims that economic inequality first increases as societies industrialise, but then decreases as mass education levels the playing field and results in wider political participation. And so it played out – at least in the West – for much of the twentieth century. But we are no longer in the industrial age, and, according to Piketty, any belief that technological progress will lead to ‘the triumph of human capital over financial capital and real estate, capable managers over fat cat stockholders, and skill over nepotism’ is ‘largely illusory’.
True sense there... we are no longer "industrial" and the models don't play right anymore. "Technology, despite its Epimethean and Promethean claims, reflects the actual world, not an ideal one. When it crashes, we are capable of thinking clearly; when it is cloudy, we apprehend the cloudiness of the world. Technology, while it often appears as opaque complexity, is in fact attempting to communicate the state of reality. Complexity is not a condition to be tamed, but a lesson to be learned." There's that cloud again. Technology manipulates. Surely you aren't naive enough to think that savvy technologists aren't manipulating the "free" market? Giant drops in a stock exchange erased in seconds?

Facial recognition builds in biases; "Minority Report"-style predictive-policing software is inherently biased; the algorithms that feed us "news", shopping and "answers" are all manipulative, and we let them be because we have no choice. Snowden showed us that we're spied upon, but too late – the damage is done. Uber manipulates its drivers to head off unionization and organization. Amazon's brutal treatment of its employees is hidden from the public because we want the benefits of technology: on my doorstep tomorrow? Sweet!

And then there are the conspiracy theorists who may be on to something, if in a completely lunatic way, that technology is the devil. "Conspiracy theories are the extreme resort of the powerless, imagining what it would be to be powerful. This theme was taken up by Fredric Jameson, when he wrote that conspiracy ‘is the poor person’s cognitive mapping in the postmodern age; it is the degraded figure of the total logic of late capital, a desperate attempt to represent the latter’s system, whose failure is marked by its slippage into sheer theme and content’. [...] the individual, however outraged, resorts to ever more simplistic narratives in order to regain some control over the situation." People buy into crap because they don't want to, or can't, do the heavy thinking. So technology wins by forfeit.
Russia didn't start with us - "In trying to support Putin’s party in Russia, and to smear opponents in countries like Ukraine, the troll farms quickly learned that no matter how many posts and comments they produced, it was pretty hard to convince people to change their minds on any given subject." They just got better when it really mattered:
And so they started doing the next best thing: clouding the argument. In the US election, Russian trolls posted in support of Clinton, Sanders, Romney, and Trump, just as Russian security agencies seem to have had a hand in leaks against both sides. The result is that first the internet, and then the wider political discourse, becomes tainted and polarised. As one Russian activist described it, ‘The point is to spoil it, to create the atmosphere of hate, to make it so stinky that normal people won’t want to touch it.’
Overload the data. Cloud the system.

Then there are the proponents, like Google's Eric Schmidt: "‘I think we’re missing something,’ he said, ‘maybe because of the way our politics works, maybe because of the way the media works. We’re not optimistic enough … The nature of innovation, the things that are going on both at Google and globally are pretty positive for humankind and we should be much more optimistic about what’s going to happen going forward.’" So, for them that control, technology is good, right?

The dystopias of Ghost in the Machine, Blade Runner/Electric Sheep, and Gibson's Neuromancer are closer than we think.
1 vote Razinha | Mar 27, 2019 |


External resources that mention this book: Wikipedia (in English)

Rating

Average: 4

Distribution: 2 stars: 2, 3 stars: 8, 3.5 stars: 3, 4 stars: 16, 4.5 stars: 1, 5 stars: 13
