God and Golem, Inc.: A Comment on Certain Points where Cybernetics Impinges on Religion (1964) by Norbert Wiener
In a little Thai Buddhist tract about the Kalama Sutta and how to deal with an overabundance of options, I read a line about computers. The gist was that however powerful they are, it is the mind of the designer of the gadget/tool/program that sets the intention of the device's interactions within and between the beings using it and/or used by it. Since these minds aren't necessarily held to any standard of enlightenment, or even humanistic morals, the tract advises: "computers shouldn't be worshiped so much." In this book, the only Norbert Wiener text I've read, I was a bit surprised to find a similar sentiment, where he criticizes "gadget worshippers" (53) for not taking into account the unforeseen consequences of technological "magic." I read it with a critical eye on motivation, as I watch the world be further transformed by tech development, including my home city of Seattle being completely remade into new temples of worship for The Code, its acolytes with their heads in the cloud. Wiener seems aware of how AI will probably develop, and is possibly a bit wary of the possibilities, but he is not altogether against them or for them. Instead he points out how the religious (and, I'd say, nonsecular humanist) criticism and fear of AI and robotic development is founded, and explains in detail how machines can reproduce themselves. The motive for the creation of machines and AI may not be on the evil side of the dualistic coin. The argument I read in this text is that doomsday weaponry is the real evil, while mechanistic complication and development to the point of AI and self-organizing systems is a step in evolutionary curiosity and natural human ingenuity.

Reading this in the period of ubiquitous computing and Philip K. Dick-shaped robots making jokes about putting humans in zoos, I still think that the creation of complex tools and even other beings isn't scary; it's the strong vortexes of capital and power that encourage the direction of such development. The vectors of that development are still influenced by corporate agendas, and even though people like Wiener may have chosen not to emphasize the control aspect of cybernetics as much as the systemic efficiency, I'm witnessing a magnetism toward a future where the minds of the few directly affect the lives of the many, and all systems of the earth are put under concrete (for as long as you can keep them down: a biologist friend of mine saw an abandoned highway completely destroyed by trees growing through it, because it was easier for them to grow there than in the thicket of roots). I would emphasize that through the other force I see arising, involving play, creation for curiosity's sake, and the symbiogenesis of nonanthropocentric life systems, the tools and epistemology of cybernetics can be used for benefit. But if we keep the underlying metaphors of control and military-industrial-complex phrasing that Wiener and many authors of programming manuals seem to have as their psychophysical OSes, we're going to fail and make the world even more damaged; I have a hard time seeing any other future within these conditions. Anyway, I'm editorializing. I was actually expecting Norbert Wiener to be a far worse person, but besides the uncomfortable references to Hitler and Eichmann, and the sketchy red, black, and white cover inciting paranoia about the fascist ties of futurism, I feel that Wiener was acting out of a kind of passionate invention. The book also made me wonder how the discussion of mechanical development vs. religious/humanistic beliefs has evolved since the 1960s, when it was written.
There still seems to be a polarization, with people mostly leaning either toward technological saviorism or toward the total back-to-nature neopioneerism I see in America at least, which could ironically destroy the forest just as badly, as Alaska's gentrification and development, encouraged by TV shows like "Alaska: The Last Frontier," seems to do under the guise of naturalism. A magazine display painted this picture very nicely for me: the cover of Wired showed Jerry Seinfeld (upper-class celeb) wearing Google Glass, propped up on the rack next to American Pioneer magazine, whose cover drawing showed the archetype of working-class masculinity, a coonskin-hat-wearing hunter perfectly capable of self-sustaining in the (hard-to-find) wilderness. The title of this book implies this continuing conversation: what are the intersections between what may control and/or influence us, human control and/or influence over other life, and the patterns we base these actions on, such as industry and market capitalism, or free-market, open-source-style development? There are many voices to listen to along these lines, and not one in particular that I feel is worthy of worship.

It's very strange to read this book and remember it was written more than 50 years ago; it deals with exactly the same questions on the tip of society's tongue right now. It's full of insightful but (from the perspective of the present day) somewhat ignorant ideas. Apart from a pretty racist little nugget at the very end (I think? It's hard to tell), it's an important book for looking back at what the past thought of the future. This short book "is based on lectures given at Yale, at the Societe Philosophique de Royaumont, and elsewhere." It deals with some of the ways we have traditionally defined intelligent life: specifically learning and reproduction.
The author demonstrates that, even in 1963, a machine could learn to play checkers and improve itself, and that we have made machines that are "very well able to make other machines in their own image." His third point, and I think the most significant, is that we must be very careful not to push responsibility for our actions onto machines. For a modern example: if a drone kills someone, it is the person or group who operated or programmed the drone who is the killer. There is some very technical discussion that I skimmed.
Belongs to the publisher series MIT Paperback Series.
In this short essay, the famous mathematician Norbert Wiener lays out the main points at which cybernetics ought to concern religions: for example, the machine that learns, or the machine with the capacity to reproduce itself, which touches the common belief that God made us in his likeness.
Reading this book now, more than a half-century after it came out, offers an exciting window into the time when the potential of "learning machines" was just beginning to become evident. One of Wiener's examples is a checkers-playing machine that had been able to learn from each game it played and had reached the point where it would never lose. His prediction that the same would happen with a chess-playing machine came true, though it took a bit longer than the ten years he expected.
Wiener had presented the ideas in these essays in various settings. His assertion that our relation to machines that learn and reproduce themselves is comparable to the traditional notion of God's relation to man seemed blasphemous to some listeners at the time. For some, the indignation was on religious grounds; for others (biologists), the notion that machines made of inorganic matter could be compared to biological life-forms seemed heretical.
There is little wasted verbiage in this book. At times, it is aphoristic. I liked his reworking of the words of Christ: "Render unto man the things which are man's and unto the computer the things which are the computer's." Of course, this is easier said than put into practice. Many of his innovations led to leaps in productivity, but he was bothered by the human cost of workers made redundant by automation, a problem that remains with us.
I especially enjoyed Wiener's account of an essay, "Science and Society," that he had published in a Soviet academic journal a few years earlier. He seemed amused, but not surprised, that the same issue of the journal ran a rebuttal—longer than his essay—from an orthodox Marxist standpoint. He suggested that, had he published it in the West instead, it would have been rebutted in the name of free enterprise. His point had been that science made an important contribution to the homeostasis (balance) of the community, yet its contribution had to be assessed anew every generation or so. He was neither anti-Marxist nor anti-Capitalist, Wiener maintained, but anti-rigidity.
Given that stance, Wiener might be skeptical of a reader coming to his book two generations after publication. But then again, his point was not that we should throw out scientific contributions after twenty years, but that they should be reassessed. I think a reassessment of Wiener's contribution, on the evidence of this slim book, is that we continue to need scientists who are humane while at the same time unafraid to challenge old orthodoxies.