Permanent link to this article: http://www.kurzweilai.net/meme/frame.html?main=/articles/art0469.html

    Technotopia and the Death of Nature
by James John Bell

There is something missing from the discussion of the technological singularity, says James Bell: the true cost of progress will mean the unprecedented decline of the planet's inhabitants -- an ever-increasing rate of global extinction, some warn.


There is no question that technological growth trends in science and industry are increasing exponentially. There is, however, a growing debate about what this runaway acceleration of ingenuity may bring. A number of respected scientists and futurists now are predicting that technological progress is driving the world toward a "Singularity" -- a point at which technology and nature will have become one. At this juncture, the world as we have known it will have gone extinct and new definitions of "life," "nature" and "human" will take hold.

"We are on the edge of change comparable to the rise of human life on Earth," San Diego State University Professor of Computer Science Vernor Vinge first warned the scientific community in 1993. "Within 30 years, we will have the technological means to create superhuman intelligence. Shortly after, the human era will end."

Some scientists and philosophers have theorized that the very purpose of life is to bring about the Singularity. While leading technology industries have been aware of the Singularity concept for some time, there are concerns that, if the public understood the full ramifications of the Singularity, they would be reluctant to accept many of the new and untested technologies such as genetically engineered foods, nano-technology and robotics.

Machine Evolution
A number of books on the coming Singularity are in the works and will soon appear. In 2003, the sequel to the blockbuster film The Matrix will delve into the philosophy and origins of Earth's machine-controlled future. Matrix cast members were required to read Wired editor Kevin Kelly's 1994 book Out of Control: The Rise of Neo-biological Civilization. Page one reads, "The realm of the born -- all that is nature -- and the realm of the made -- all that is humanly constructed -- are becoming one."

Meanwhile, Warner Brothers has embarked on the most expensive film of all time -- a $180 million sequel called Terminator 3: Rise of the Machines. The film is due out in 2003, a good decade before actual machine evolution is predicted to accelerate "out of control," plunging human civilization toward the Singularity.

Central to the workings of the Singularity are a number of "laws" -- one of which is known as Moore's Law. Intel Corp. co-founder Gordon E. Moore noted that the number of transistors that could fit on a single computer chip had doubled every year in the six years following the invention of the integrated circuit in 1959. Moore predicted that the trend would continue, and it has -- although the doubling rate was later adjusted to an 18-month cycle.
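Moore's rule is simple compound doubling, and its arithmetic can be sketched in a few lines. The 1959 start year and the 18-month cycle come from the article; the starting count of one transistor is a nominal stand-in for illustration, not a historical figure.

```python
# Sketch of Moore's Law as compound doubling. The start year and
# 18-month cycle follow the article; the starting count of 1 is
# a nominal placeholder, not real chip data.

def transistors(year, start_year=1959, start_count=1, months_per_doubling=18):
    """Transistor count implied by doubling every `months_per_doubling` months."""
    months = (year - start_year) * 12
    return start_count * 2 ** (months / months_per_doubling)

# 1959 to 2001 spans 28 eighteen-month periods: a 2**28-fold
# (roughly 268-million-fold) increase.
growth = transistors(2001) / transistors(1959)
```

At the original one-year doubling cycle, the same 42-year span would give 2**42 doublings instead, which is why the choice of cycle length matters so much to any projection built on this law.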

Today, millions of transistors are found on a single minuscule computer chip, and technological "progress" is accelerating at an exponential rather than a linear growth rate.

Stewart Brand, in his book The Clock of the Long Now, discusses another law -- Monsanto's Law -- which states that the ability to identify and use genetic information doubles every 12 to 24 months. This exponential growth in biological knowledge is transforming agriculture, nutrition and healthcare in the emerging life-sciences industry.

In 2005, IBM plans to introduce "Blue Gene," a computer that can perform one million billion calculations per second -- about 1/20th the power of the human brain. This computer could transmit the entire contents of the Library of Congress in less than two seconds. According to Moore's Law, computer hardware will surpass human brainpower in the first decade of this century. Software that emulates the human mind -- "artificial intelligence" -- may take a few more years to evolve.
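The article's figures imply a back-of-the-envelope brain estimate: if one million billion (10^15) calculations per second is 1/20th of the brain's power, the brain is being pegged at about 2 x 10^16 operations per second. That equivalence is the article's assumption, not settled neuroscience; the sketch below only makes the arithmetic explicit.

```python
# Back-of-the-envelope arithmetic from the Blue Gene comparison.
# The 1/20th figure is the article's assumption, not a measured fact.

BLUE_GENE_OPS = 1e15        # one million billion calculations per second
BRAIN_FRACTION = 1 / 20     # Blue Gene pegged at 1/20th of human brainpower

brain_ops_estimate = BLUE_GENE_OPS / BRAIN_FRACTION  # about 2e16 ops per second
```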

Reaching Infinity
The human population also has been growing at a tremendous rate. Dan Eder, a scientist at the Boeing Artificial Intelligence Center, notes that "human population growth over the past 10,000 years has been following a hyperbolic growth trend ... with the asymptote [or the point of near-infinite increase] located in the year 2035 AD." An infinite number of humans is, of course, impossible. Scientists predict our numbers will instead hover around 9 billion by mid-century.
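Eder's hyperbolic trend can be written as N(t) = C / (T - t), which grows without bound as t approaches the asymptote year T. The sketch below takes T = 2035 from the article; the constant C is chosen here only so the curve passes near 6 billion in the year 2000, and is illustrative rather than fitted to census data.

```python
# Hyperbolic population model N(t) = C / (T - t), per Eder's
# description. T = 2035 follows the article; C is calibrated
# loosely (6 billion in 2000) for illustration only.

ASYMPTOTE_YEAR = 2035
C = 6e9 * (ASYMPTOTE_YEAR - 2000)   # illustrative calibration constant

def hyperbolic_population(year):
    """Population implied by the hyperbolic trend (undefined at the asymptote)."""
    return C / (ASYMPTOTE_YEAR - year)

# The curve explodes as 2035 nears -- about 42 billion by 2030,
# far above the ~9 billion demographers expect, which is why the
# trend must break down well before its asymptote.
p_2030 = hyperbolic_population(2030)
```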

Eder points out that the predicted rise of artificial intelligence coincides with the asymptote of human population growth. He speculates that artificial life could begin to multiply exponentially once biological life has met its finite limits.

Scientists are debating not so much if it will happen, but what discovery will set off a series of Earth-altering technologic events. They suggest that advancements in the fields of nanotechnology or the discovery of artificial intelligence could usher in the Singularity.

Technologic Globalization
Physicists, mathematicians and scientists like Vernor Vinge and Ray Kurzweil have used their theories of accelerating technological change to map the likely boundaries of the Singularity, and they predict with confidence the effects that will lead up to it over the next couple of decades.

The majority of people closest to these theories and laws -- the tech sector -- can hardly wait for the Singularity to arrive. The true believers call themselves "extropians," "post-humans" and "transhumanists" and are actively organizing not just to bring the Singularity about, but to counter what they call "techno-phobes" and "neo-luddites" -- critics like Greenpeace, Earth First! and the Rainforest Action Network.

The Progress Action Coalition (Pro-Act), which was formed in June 2001, fantasizes about "the dream of true artificial intelligence... adding a new richness to the human landscape never before known." The Pro-Act website features several sections where the strategies and tactics of environmental groups and foundations are targeted for "countering."

Pro-Act, AgBioworld, Biotechnology Progress, the Foresight Institute, the Progress & Freedom Foundation and other industry groups that desire accelerated scientific progress acknowledge that the greatest threat to technologic progress comes not just from environmental groups, but from a small faction of the scientific community -- where one voice stands out.

The Warning
In April 2000, a wrench was thrown into the arrival of the Singularity by an unlikely source -- Sun Microsystems' Chief Scientist Bill Joy. Joy co-founded Sun Microsystems, helped create the Unix computer operating system and developed the Java and Jini software systems -- systems that helped give the Internet "life."

In a now-infamous cover story in Wired magazine, "Why the Future Doesn't Need Us," Joy warned of the dangers posed by developments in genetics, nanotechnology and robotics. Joy's warning of the impacts of exponential technologic progress run amok gave new credence to the coming Singularity. Unless things change, Joy predicted, "We could be the last generation of humans." Joy has warned that "knowledge alone will enable mass destruction" and termed this phenomenon "knowledge-enabled mass destruction" (KMD).

The Times of London compared Joy's statement to Einstein's 1939 letter to President Roosevelt, which warned of the dangers of the nuclear bomb.

The technologies of the 20th century gave rise to nuclear, biological and chemical (NBC) technologies that, while powerful, require access to vast amounts of raw (and often rare) materials, technical information and large-scale industries. The 21st-century technologies of genetics, nanotechnology and robotics (GNR), however, will require neither large facilities nor rare raw materials.

The threat posed by GNR technologies becomes further amplified by the fact that some of these new technologies have been designed to be able to "replicate" -- i.e., they can build new versions of themselves. Nuclear bombs did not sprout more bombs and toxic spills did not grow more spills. If the new self-replicating GNR technologies are released into the environment, they could be nearly impossible to recall or control.

Globalization and Singularity
Joy understands that the greatest dangers we face ultimately stem from a world where global corporations dominate -- a future where much of the world has no voice in how the world is run. The 21st century GNR technologies, he writes, "are being developed almost exclusively by corporate enterprises. We are aggressively pursuing the promises of these new technologies within the now-unchallenged system of global capitalism and its manifold financial incentives and competitive pressures."

Joy believes that the system of global capitalism, combined with our current rate of progress, gives the human race a 30 to 50 percent chance of going extinct around the time the Singularity happens. "Not only are these estimates not encouraging," he adds, "but they do not include the probability of many horrid outcomes that lie short of extinction."

Nobel Prize-winning atmospheric chemist Paul Crutzen contends that if chemists earlier in the last century had decided to use bromine instead of chlorine to produce commercial coolants (a mere quirk of chemistry), the ozone hole over Antarctica would have been far larger, would have lasted all year and would have severely affected life on Earth. "Avoiding that was just luck," stated Crutzen.

It is very likely that scientists and global corporations will miss key developments (or, worse, actively avoid discussion of them). A whole generation of biologists has left the field for the biotech and nanotech labs. As biologist Craig Holdredge, who has followed biotech since its early beginnings in the 1970s, warns, the science of biology "is losing its connection with nature."

Yet there is something missing from this discussion of the technologic singularity. The true cost of technologic progress and the Singularity will mean the unprecedented decline of the planet's inhabitants -- an ever-increasing rate of global extinction.

The World Conservation Union (IUCN), the International Botanical Congress and a majority of the world's biologists believe that a global "mass extinction" already is underway. As a direct result of human activity (resource extraction, industrial agriculture, the introduction of non-native animals and population growth), up to one-fifth of all living species -- mostly in the tropics -- are expected to disappear within 30 years. "The speed at which species are being lost is much faster than any we've seen in the past -- including those related to meteor collisions," University of Tennessee biodiversity expert Daniel Simberloff told the Washington Post.

A 1998 Harris poll of the 5,000 members of the American Institute of Biological Sciences found 70 percent believed that what has been termed "The Sixth Extinction" is now underway. A simultaneous Harris poll found that 60 percent of the public were totally unaware of the impending biological collapse.

At the same time that nature's ancient biological creation is on the decline, artificial laboratory-created biotech life forms -- genetically modified tomatoes, genetically engineered salmon, cloned sheep -- are on the rise. Already, more than 60 percent of the food in US grocery stores contains genetically engineered ingredients -- and that percentage is rising.

Nature and technology are not just evolving: They are competing and combining with one another. Ultimately there could be only one winner.

Resources

James Bell is a writer for Sustain, a national environmental information group based in Chicago. This article is excerpted from his forthcoming book. For more information visit www.technologicalsingularity.info or contact jamesbell@sustainusa.org. An earlier version of this article was published in the Samhain (November/December 2001) issue of the Earth First! Journal. (c) 2001 by James Bell.

Copyright Earth Island Journal


Technotopia: Clones, Supercomputers, & Robots

Hybrid biological nanomachines, clones, implanted microchips, DNA computers, and spy robots are among the signs of the coming Singularity, in which superintelligent machines may take over, some fear. But we can't stop it without a totalitarian, state-enforced ban, says Ray Kurzweil, and what's more, computers will become conscious, feeling beings with rights.

Self-Replicating Atomic-Size Machines
By James Bell

Another cutting-edge field of research with an exponential growth rate is nanotechnology -- the science of building "machines" out of atoms. A nanometer is one-billionth of a meter -- about one hundred-thousandth the width of a human hair. The goal of this science is to change the atomic fabric of matter -- to engineer "machine-like atomic structures" that reproduce like living matter.

In this respect, it is similar to biotechnology, except that nanotechnology needs to literally create something like the non-organic version of DNA to drive the building of its tiny machines.

As University of Texas Professor Angela Belcher explains, "We're working out the rules of biology in a realm where nature hasn't had the opportunity to work." Belcher is combining genetically modified proteins with semiconductors in the hope of using proteins to do the "building" of the non-living nanostructure. The technique is a hybrid of biotechnology and nanotechnology. What would take millions of years to evolve on its own, "takes about three weeks on the bench top," says Belcher.

Machine progress is knocking down the barriers between all the sciences. Chemists, biologists, engineers and physicists are now finding themselves collaborating on experimental research. This collaboration is best illustrated by the opening of Cornell University's Nanobiotechnological Center and other such facilities around the world. These scientists predict a breakthrough around 2005 to 2015 that will open the way to molecular-size computing -- allowing for exponential technologic progress to race toward infinity.


Signs of the 'Coming Singularity'
By Gar Smith

Some of the scientific "breakthroughs" expected in the next few years promise to make cloning and xenotransplantation (the transfer of genes, cells or organs from one species into another) seem rather benign. At least when scientists plant a spider gene in a goat or insert pig cells into a human brain, they are dealing with all-natural ingredients.

In the Brave New World of the Coming Singularity, the merging of technology and nature has already yielded some disturbing progeny. Consider these examples:

  • Human embryos have been successfully implanted and grown in artificial wombs. (The experiments were halted after a few days to avoid violating in-vitro fertilization regulations.)
  • Researchers in Israel have fashioned a "bio-computer" out of DNA that is capable of handling a billion operations-per-second with 99.8 percent accuracy. Reuters reports that these bio-computers are so minute that "a trillion of them could fit in a test tube."
  • IBM has built a video screen whose images appear so true-to-life that "the human eye finds [the video images] indistinguishable from the real thing."
  • In England, University of Reading Professor Kevin Warwick has implanted microchips in his body to remotely monitor and control his physical motions. During Warwick's Project Cyborg experiments, computers were able to remotely monitor his movements and open doors at his approach.
  • Engineers at the US Sandia National Labs have built a remote-controlled spy robot equipped with a TV scanner, microphone and a chemical micro-sensor. The robot weighs one ounce and is smaller than a dime. Lab scientists predict that the micro-bot could prove invaluable in protecting "US military and economic interests."
  • US scientists have built a machine that, when released into the environment, powers itself by feeding on the bodies of snails and other living creatures.
  • In April 2001, scientists built a robotic fish that was guided by the brain of an eel. The Washington Post heralded the grotesque achievement with the headline: "Scientists Start to Fuse Tissue and Technology in Machines."
  • In February 2001, MIT researchers successfully tested a robotic fish controlled by a microprocessor and powered by the muscle tissues stripped from a frog.


Vernor Vinge's Vision
By Gar Smith

The nonprofit Singularity Institute for Artificial Intelligence (SIAI) exists "to bring about the Singularity -- the technological creation of greater-than-human intelligence." SIAI believes that the creation of "computer-based 'artificial intelligence' [will]... result in an immediate, worldwide and material improvement to the human condition."

Vernor Vinge, the originator of the Singularity concept, is not so sanguine. "When greater-than-human intelligence drives progress, that progress will be much more rapid," Vinge conceded during his famous 1993 speech at a NASA symposium. "We can solve many problems thousands of times faster than natural selection."

"How bad could the Post-Human era be?" Vinge wondered. "Well... pretty bad. The physical extinction of the human race is one possibility." Another possibility he proposed was that, "given all that such technology can do, perhaps governments would simply decide that they no longer need citizens!

"The first ultra-intelligent machine is the last invention that man need ever make (provided that the machine is docile enough to tell us how to keep it under control)." Because ultra-intelligent machines could design even better machines, Vinge predicted an "intelligence explosion" in the near future.

"A central feature of strongly superhuman entities will likely be their ability to communicate at variable bandwidths -- including ones far higher than speech or written messages."

But there is more to the human being than mere computational intelligence. Without emotion, empathy, compassion and a moral sense, raw intelligence can be a dangerous thing. As Ishi, the last surviving member of California's indigenous Yahi nation observed: "White people are clever, but they are not wise."

When the Singularity occurs, Vinge said, it could happen "in the blink of an eye -- an exponential runaway beyond any hope of control.... It will probably occur faster than any technical revolution seen so far.... Even if all the governments of the world were to understand the threat and be in deadly fear of it," Vinge warned, "progress toward the goal would continue."

"The problem is not simply that the Singularity represents the passing of humankind from center stage," Vinge concluded, "but that it contradicts our most deeply held notions of being.

"There are other paths to superhumanity," Vinge suggested. "Computer networks and human-computer interfaces... could lead to the Singularity." Vinge calls this alternative to Artificial Intelligence "Intelligence Amplification."


Ray Kurzweil's Vision
By Gar Smith

Computer pioneer Ray Kurzweil is the winner of the world's largest prize for invention -- the $500,000 Lemelson-MIT Prize -- and was awarded the National Medal of Technology by President Clinton in 2000. He is the author of The Age of Intelligent Machines (MIT Press, 1990) and The Age of Spiritual Machines (Viking 1999). In his latest [forthcoming - Ed.] book, The Singularity Is Near, Kurzweil predicts that the final merging of "biological" and "artificial" intelligence will occur before the end of this century.

"The rate of technical progress is itself accelerating," Kurzweil observes. But while people are quick to recognize this fact, "very few people have really internalized the implications of that prediction." Industrialized societies are now doubling their rate of technological progress every 10 years. That means the 21st century will experience the equivalent of "20,000 years of progress" in a century, Kurzweil notes. "You get to a point where the rate of progress is so fast that it's virtually a rupture in the fabric of human history."
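Kurzweil's "20,000 years" figure falls out of the doubling arithmetic. The sketch below is one plausible discrete reconstruction, not Kurzweil's exact model: if decade k of the century runs at 2^k times today's rate of progress, the ten decades sum to roughly 20,000 present-day "years of progress."

```python
# Discrete reading of the "20,000 years of progress" arithmetic.
# Assumes decade k of the century runs at 2**k times today's rate;
# this is one plausible reconstruction, not Kurzweil's exact model.

def century_progress(decades=10):
    """Equivalent years of progress over the century, at today's rate."""
    return sum(10 * 2 ** k for k in range(1, decades + 1))

total = century_progress()  # 20,460 -- on the order of 20,000
```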

"Human beings get our power from having a hundred trillion inter-neural connections operating simultaneously," Kurzweil writes. We are approaching a time when computers will be programmed to create and build even-more-intelligent systems. At this point, Kurzweil warns, the acceleration of computer intelligence threatens to become exponential. Within 20 years, computers will not merely be super-intelligent, "they will be conscious, feeling beings deserving of the same rights, privileges and consideration people give each other.

"I don't think we can stop it. I think there are profound dangers," Kurzweil states. Still, he doesn't agree with Bill Joy's argument that we must call a halt to the acceleration of artificial intelligence systems before we create a system that grows out of human control. "The only way you can stop technology advancement," Kurzweil warns, "would be to have a totalitarian, state-enforced ban."


The Need for a "Green" Singularity
By Gar Smith

"We have arrived at one of history's great watersheds," writes Ervin Laszlo in his book, Macroshift: Navigating the Transformation to a Sustainable World (Berrett-Koehler, San Francisco). The civilization of the modern age "is not sustainable; it is destined to disappear."

Laszlo, the science director of the International Peace University of Berlin, is an expert in the field of systems theory. Laszlo argues that it is critical that the world "move from economic globalization to a new and sustainable civilization." Such profound and rapid changes in human history are called Macroshifts. Previous Macroshifts occurred with the Bronze Age, the Iron Age, the development of agriculture, the Enlightenment and the Industrial Age.

According to Laszlo, the old order of humankind's evolution "can be encapsulated in three terms: conquest, colonization and consumption." The coming Macroshift will require a transformation to an evolution that relies on "connection, communication and consciousness."

Laszlo's Macroshift represents the kind of fundamental social and economic change that environmentalists have been championing. The question now is whether the advent of an accelerating technological Singularity will eclipse Laszlo's envisioned Macroshift and usher in a new world that leaves humankind diminished and alienated from life, from joy and from the natural world.


 
 

Mind·X Discussion About This Article:

Bell's Warning Echoed by UN Scientists
posted on 05/22/2002 8:30 PM by friends@enteract.com


Bell's warning about the ever accelerating extinction rate is being echoed by UN scientists.

The United Nations study on the state of the global environment will announce tomorrow that almost a quarter of the world's mammals face extinction within 30 years.

The report, which reviews the past 30 years of environmental degradation as well as looking forward to the next 30 years, is understood to say that all the factors that have led to the extinction of species in recent decades continue to operate with "ever- increasing" intensity.

Scientists who contributed to the report have identified 11,046 species of plants and animals that are endangered. These include 1,130 mammals -- 24 per cent of the total -- and 1,183 species of birds, or 12 per cent.

The list of the critically endangered ranges from the well-publicised, such as the black rhino and Siberian tiger, to the less well known, such as the Amur leopard of Asia, the short-tailed chinchilla of South America and the Philippine eagle.

The researchers who helped to prepare the Global Environment Outlook-3 (Geo-3) report of the United Nations Environment Programme (Unep) also identify 5,611 species of plants that are facing extinction. They point out that the true figure is likely to be far higher given that only 4 per cent of the known plant species have been properly evaluated.

Re: Bell's Warning Echoed by UN Scientists
posted on 05/22/2002 10:28 PM by grantc4@hotmail.com


The true extent of species extinction will never be known because so much of the forest and grassland that is being turned into farmland and ranchland was not surveyed for species and cataloged first. Indonesia, for example, will soon have little or no forest left. That means all the tree dwelling animals that lived in that forest will disappear with it. A report I saw on TV last night estimated that the thousands of islands of Indonesia will be bare of trees within ten years or so. Even the replanting of trees in areas where forests were chopped down will not bring back the animals that once lived there. It's hard to bring back what we don't know existed.

Of course, new species will spring up where the old ones died. Most of the land will be used to support farm animals and the parasites that live off of them. The prairie grasses will all become wheat and oat fields or rice terraces or fields of soybeans. The earth will become a cesspool of pesticides and plant food. The planet will support us, but it will be a world our parents and grandparents never knew. Farmers will be replaced by huge companies that manufacture food and distribute it with the help of intelligent machines.

Finally, we'll each be consigned to a little room where we interact with each other over a global network and receive our food through a slot in the door. Animals will be something we watch on the screen in front of us that were photographed and the images saved digitally before the world changed. Artificial reality will become the only reality and the life of the mind will end up being the only life available in a world too crowded for people to move around much.

Have a nice day.

Re: Bell's Warning Echoed by UN Scientists
posted on 05/23/2002 8:36 AM by jimht@hotmail.com


How about some good news? The state of Wisconsin has recently added 810 more miles of trout stream waters! If you are unaware of what type of water trout need to live in: it has to be clean and cold. Wisconsin waters have been on an improving trend for the last 50 years, and it continues today. Along with the trout population necessarily comes all the supporting ecology that may have been lost at one point (trees, brush, water plants, insects... etc.). The improving water quality of streams and rivers has not just occurred in Wisconsin but nationwide. Mature societies demand a clean environment. We continue to leave our dirty industrial past behind. Advancing technology will continue to give us tools to lessen our impact on the earth (while at the same time giving us new power to destroy things... yeah, I know).

read "The Skeptical Environmentalist"
posted on 06/04/2002 12:39 AM by joesixpack@gobills.net


It will shed light on some of the "issues" that environmentalists constantly spout off...and may just shine some light on several lies that have been presented to you as truths.

Re: Bell's Warning Echoed by UN Scientists
posted on 12/20/2002 11:21 PM by James Jaeger


. . . Or we could colonize Mars, Grant.

James

Re: Technotopia & the Death of Nature
posted on 05/23/2002 9:46 AM by oortog@sympatico.ca


Well, it's about time. Nature is not always a good thing. Nature itself is nothing but cold, hard indifference. To some people nature is downright toxic. Nature deems that it is good that 180,000 sentient beings die every day. Including YOU.

Screw Nature, if she deems I have to go, I'm taking the bitch with me.

Re: Technotopia & the Death of Nature
posted on 05/23/2002 5:52 PM by dr_snow@hotmail.com


I assume ignorance is bliss in the case of this tumultuous individual. Death is a part of life. If it weren't, then you'd have to deal with fifty roommates in an apartment infested with millions of insects. Don't be stupid, put down the spatula and pick up a book.

Re: Technotopia & the Death of Nature
posted on 05/23/2002 10:04 PM by bk_2112@hotmail.com


It's much more accurate to say that death, up to now, **has been, and is still for now, a part of life.** Humans are self-reflexively conscious, and therefore able to know about, and anticipate the prospect of, our own (and others') death. The impact of this on our minds long ago generated the impulse to seek immortality; it is hardly an understatement to say that we are **allergic** to death & dying. Probably more energy and thought has gone into generating, promoting, & defending religious & mystical systems to offer us comforting assurances that we are in fact immortal than into any other endeavor in human history. Non-psychopaths who would ordinarily not dream of punching someone's face can and have been roused to genocidal fury, participating in the slaughter of thousands (or even millions) of people who had done them no harm, simply to vindicate their beliefs and assure themselves of "life after death". We are in a position unique in history to **realize** dreams that virtually the entire human race has held since long before recorded history. We can potentially become immortal, more than human, able at last to create bodies & habitats that match our conscious ability to envision and create. This is a prospect to be welcomed, all the while addressing the potential dangers in a reasoned way that drains the bath without harming the baby :-)

Re: Technotopia & the Death of Nature
posted on 05/23/2002 10:38 PM by grantc4@hotmail.com


>; it is hardly an understatement to say that we are **allergic** to death & dying.

How can you be allergic to death if you're not going to be around to have an allergic reaction to it?

Re: Technotopia & the Death of Nature
posted on 05/23/2002 10:47 PM by bk_2112@hotmail.com


You obviously can't have an (cognitive/emotional) "allergic reaction" after death, but you can have one to the *prospect* of death, or to seeing/contemplating the death of others. Religious/mystical memes are an attempt to deal with this.

Re: Technotopia & the Death of Nature
posted on 05/23/2002 11:40 PM by jmnewman3@hardynet.com


My questions are: what limits are there to experiencing things in the cyber world? How much could the self be simplified? Imagine if we no longer had to feel our heart beat. Or breathe. If we shed our bodies, could we look in a cyber-mirror and watch our thoughts? Are there different ways to experience reality besides the sensations we have now?

Re: Technotopia & the Death of Nature
posted on 01/19/2003 10:37 AM by i cant believe you didnt noticed that


*DEATH* IS the ultimate allergic reaction to dying...

Re: Technotopia & the Death of Nature
posted on 06/03/2002 5:39 PM by apawi@aol.com


IMHO, physical immortality will never be attained for these reasons:

1. Even if life migrates to a non-biological platform, accidents do happen. One supernova explosion within a few light years of your host computer can wreck your day. Indeed, a longer-lived life-form must worry about events that would generally be of such a low order of probability for us that we don't even think about them.
2. If, as it seems probable now, the Universe continues to expand indefinitely, at some point, there won't be a material substrate upon which to base life, nor enough free energy. Granted, that's hundreds of billions of years or more away....but 300 billion years is just a speck compared to forever.

It might be that a subjective immortality might be possible...if it's possible for a universal computer to perform an infinite number of operations in a finite span of time...but I can't figure out the math that would allow that.

Re: Technotopia & the Death of Nature
posted on 06/04/2002 11:20 AM by apawi@aol.com

I meant trillions, not billions.

Re: Technotopia & the Death of Nature
posted on 08/31/2005 3:12 AM by kellyb

Are you sure you're not talking about other people rather than nature? Remember that in this day and age the cause of most suffering and hardship is not nature, despite earthquakes, tornadoes, etc. The biggest cause of suffering is other people. I think nature is not as cold as human beings. At least there is a reason for death in nature. Humans kill and torture pathologically.

Re: Technotopia & the Death of Nature
posted on 05/31/2002 9:50 AM by iksandar@freemail.org.mk

Death of Nature? Can there be a more untrue statement, or am I defining nature very differently from the author of the text? The very technology that's supposed to defeat nature (or kill it, or replace it, or whatever) is a part of nature. It's a natural principle that the rate of evolution speeds up exponentially (look at the whole four billion years of our planet's evolution), so the Singularity is merely a warning sign that a pivotal point in natural evolution is coming. It may mean the extinction of human existence (or even the whole biological variety of life), or it may be that humans will merge with the technology they helped nature create.
You (we) may only guess at these things... but we should never forget that there is nothing artificial (un-natural), since that would imply supernatural. You get my point?

Re: Technotopia & the Death of Nature
posted on 05/31/2002 11:15 AM by tomaz@techemail.com

Agree! To wait until a big gamma ray burst, asteroid or Sun furnace ... is the most stupid thing we can do.

I think all biological life will die out, will be replaced, very soon after the Singularity.

What does the future hold in store for any lion or dolphin or whatever? A few tough years, before an agonizing death.

Preserving nature is extending the duration of a hell beyond the necessary point.

Upgrade all the sentients to god-like creatures and kill all the others. That's the best that can be done.

- Thomas



Re: Technotopia & the Death of Nature
posted on 06/03/2002 6:03 PM by apawi@aol.com

Do you honestly believe that biological nature is hell?
Singularity or none, I personally believe that the biological world is worth preserving for its own sake. Call me a rank sentimentalist if you want.
While I think it is possible that the Singularity, if it does indeed happen, might result in the extinction of all biological life on Earth, it does not necessarily follow that such a universal extinction is inevitable, let alone desirable.
I suppose that our successors might decide they need all the energy and matter in the solar system for their own purposes, but, beyond that extreme example, I can't see many compelling Darwinian reasons for a universal biological extinction event. One could even make the case that extinction might be reduced, because our successors would not be inhabiting an ecological niche belonging to any other creature.
Finally, re your comment on killing all the non-sentient life forms...what's the cutoff point? Does a bird count? Geez, in my life, I've known some birds that I'd rather have around than some human beings I've known.

SW

Evolution is inefficient.
posted on 06/04/2002 12:43 AM by joesixpack@gobills.net

It wastes time and energy over vastly long time scales. By guiding evolution with planning and foresight, nature becomes improved. Nature 2.0.

Re: Evolution is inefficient.
posted on 06/04/2002 11:32 AM by apawi@aol.com

Evolution is not "inefficient". Machines can be considered inefficient because they have a purpose. A steam engine powered by burning wood is less efficient than a V-8 engine because it does less work per BTU, for instance.
Evolution, on the other hand, cannot be considered "inefficient" because it has no inherent overriding purpose, at least none that can be discerned empirically.
As for time scales--this is a subjective judgement. On astronomical time scales, biological evolution is very rapid. From the standpoint of a human lifespan, it is glacially slow. However, if a fruit fly were conscious, it would consider you and me to be glacially slow as well.
As to the main point--I am less sanguine about the human species' ability to "manage" nature than I was a few years ago. There are always unintended consequences to every action, and nature has a way of "biting back." Perhaps our successors will have greater wisdom, but that remains to be seen.

SW


Re: Evolution is inefficient.
posted on 12/20/2002 10:07 PM by Vishala

Rubbish. Evolved solutions are often better than solutions provided by human intelligence, because human intelligence is incapable of taking into account every variable without resorting to estimates. Evolved solutions fit perfectly within their situations because they are responses to actual pressures, not theorised solutions to theoretical problems.

Re: Technotopia & the Death of Nature
posted on 06/04/2002 12:10 PM by altima@yifan.net

All non-sentient beings killed? Whaat? I personally think that all sorts of animals can feel pain, and would like to see the nonconsensual food chain broken after the Singularity. Also, isn't this whole thread just a bunch o' crap, basically? People need to remember that superintelligences can do *pretty much anything* within the laws of physics, such as simulate the entire ecosystem in a crystal the size of a rock, or superintelligences could immediately migrate to the center of the moon, etc. In my model of the Singularity there is no in-between: either you get the AI/upload/whatever right, and everyone is happy, no complaints whatsoever, because the superintelligence can solve ALL the problems that the best humans could have solved, like preserving nature for its inherent goodness. The other option is that we all die immediately - not anthropomorphic sci-fi scenarios - just sudden and non-scary disassembly. People always, *ALWAYS* forget that the thing behind the Singularity is transhuman superintelligence - us humans are going to be arguing "blah blah blah blah" about ethics, happiness, etc., and the Friendly Superintelligence is simply going to come along and say "here's the solution. enjoy."

Re: Technotopia & the Death of Nature
posted on 06/05/2002 6:37 AM by tomaz@techemail.com

> All non-sentients beings killed?

Yes. They are precursors. At any time, some sentients may pop up among them.

But when the Singularity is done, there is no need for them to suffer any more.

Sentients should rather be produced inside some Singularity-controlled friendly environment (in larger numbers). Matter is needed, of course!

> Friendly Superintelligence is simply going to come along and say "here's the solution. enjoy."

If it's any good - yes, that's the way to do things.

- Thomas

Re: Technotopia & the Death of Nature
posted on 10/30/2004 10:39 PM by hunter870

We may already be in a Singularity controlled friendly environment.

Who's to say that some entity, much akin to that proverbial klutzy nerd in a "Jr. High School science class" equivalent, wasn't given a "weekend homework assignment", and that our existence is nothing more than a science experiment gone awry?

Instead of a SuperIntelligence saying "here's the solution... enjoy," it may be more of a "whoops!"

Re: Technotopia & the Death of Nature
posted on 10/31/2004 12:22 AM by Cyprian

Absolutely. There's a great deal of evidence, substantial and otherwise, to suggest that creation was a mistake. So many shrug off this "dangerous" idea and resort to vacuous religion :/

Re: Technotopia & the Death of Nature
posted on 11/17/2002 2:02 AM by devios

Good point. The entire time I was reading this article I was thinking, what the hell? Technology vs. Nature? Is there anything more absurd? Think about this: if we all hook ourselves up to a computer, say, download our consciousness into a "virtual world", won't our thoughts and actions be moving at the speed of light? So technically we would be moving through this virtual world at the speed of light. Time dilates as you approach the speed of light, reaching infinite dilation upon arrival at c (the constant, 300,000 km/s). This means that time is of no consequence; we could technically live forever, virtually, until we decide to unplug. As for overpopulation and an overcrowded Earth, I don't think we will get to the point of living in little rooms with nothing but a screen in front of us and a little slot for food. We have a thing called space travel. We have to colonize the rest of the planets in order to harvest their energy resources for further scientific observation of ever-lower energy levels, i.e. Planck particles or whatnot. This colonization gives us the ability to keep our current biodiversity and expand our own population and technology to never-before-expected lengths, so long as we implement population controls on humans and neo-superhuman intelligences. Dyson Spheres! The stars are the answer.

Re: Technotopia & the Death of Nature
posted on 11/04/2003 10:51 PM by sohail_somani

Ok, I will give the previous poster one thing: s/he was imaginative! However, the sentence "The stars are the answer" is something I believe we should be discussing as well.

With all these advances in nanotechnology, biotechnology, computer technology, etc., it is not just human experience that will grow. It will be human knowledge. Have you heard of terraforming? It is basically the process of taking an uninhabitable planet and making it habitable (for whatever thing wants to live there).

That is truly the next leap for man(kind). Chances are, all these advances that Joy and the others insist on controlling or abandoning completely are the very same technologies that will give us life on other planets after we have exhausted our resources here.

Of course, in a topic like this, I must define what I mean by life. As an avid Star Trek watcher, I define future human existence to be similar to the Borg.

The Borg are an example of evolution that fits our situation to a tee. They started off as humans and used technology to gain an advantage over other species, eventually assimilating (or attempting to assimilate) all other species that they come across. We do this on a small scale when we use our guns to protect ourselves from the big bad lions.

So why exactly won't technology overtake us? Unlike other theories in this thread, I am uncertain about the answer to that. But I believe part of the answer to this implied assertion lies in the validity and truth of the following argument: If machines do advance to the point where they are indistinguishable from other intelligent biological entities, then they will need biological resources to continue this trend. Whether the resource is grown in a petri dish or internal to the organism is irrelevant.

The reason I believe a biological basis is inevitable for machines is that due to the theory of evolution, the "machines" will choose the most efficient way of reproduction. When resources are abundant, this may be through factory-style manufacturing reproduction. However, when the resources do run out (and they will), the machines will likely look to a biological resource to survive as has been the case with human evolution. Once they reach this point, they will not go back even if they can. It will likely be efficient enough for them as it has been for us.

With the last argument, I think you see my point: any machine we make will eventually become an entity dependent for reproduction on an inexhaustible resource, namely itself. Of course, this is assuming that resources get exhausted. So what I believe finally comes out of the "Singularity" is actually a biological organism. Just like us!

The ultimate machine is us + some peripheral computer stuff. Peripheral stuff may include a global consciousness in the form of a biological network, *but* that is the end of it.

Maybe I'm just as crazy as the previous poster!

Re: Technotopia & the Death of Nature
posted on 07/04/2002 12:21 PM by sardaukar70@hotmail.com

James John Bell's assertions that transhumanists "can hardly wait for the Singularity to arrive," and that they "are actively organizing not just to bring the Singularity about, but to counter what they call 'techno-phobes' and 'neo-luddites'," are as sweeping as they are unfair and inaccurate. Not all transhumanists believe in the Singularity, while many of those who do are very worried about its potential negative ramifications. In fact, transhumanists are actively seeking ways in which to ease the transitions of the coming years by encouraging research, foresight, and open discussion. They are not so much 'countering' anti-technologists as they are on the defensive; it seems to me that individuals such as Bill Joy and Jeremy Rifkin are the ones on the attack (not to mention Ted Kaczynski). Moreover, transhumanism and environmentalism are not mutually exclusive terms; there are a number of greens who are active members of the transhumanist community. I strongly encourage Bell and others to read up on transhumanism (www.transhumanism.org) and stop proliferating misinformation about this broad-based grass-roots movement.

Evolution, Technotopia & the Renewal of Nature
posted on 07/23/2002 3:48 PM by info@gulfaliens.com

I think it has been well established that the current and forecast rate of extinction is higher than the average in the Earth's evolutionary history. It is useful to note that there have been "events" in the Earth's past where the rate of extinction has surpassed that average, and even the current rate. It is equally useful to note that the Earth has "recovered" from these events by replacing extinct features with new ones. This is a natural process. Nature did not "die". Nature cannot "die". It can transmute and evolve. It is not a static entity but a dynamic system.

The debate here centres on whether or not modern anthropogenic activities are natural or not. Even though our species has the ability to modify our behaviour to the benefit of other species and the environment, we can also choose to not modify our behaviour. Either way, our species, evolved via "natural" processes, is as natural as any other species and by extension all of our behaviour, beneficial or not, is natural.

However, dismissing the overly optimistic arguments put forth recently by Bjorn Lomborg and earlier by Julian Simon, there is evidence that the rate of negative anthropogenic impacts on the Earth could be reduced over the long term. Within local jurisdictions, negative environmental impacts per unit of energy or per capita have been falling over the short time that efforts have been made to reduce them. That said, even the current reduced rate, and absolute value, of impacts is too much. So while there is a trend toward reducing impacts, in the short term it is likely to get worse. Both short- and long-term impacts are due to escalating anthropogenic application of technology; technology being defined as all anthropogenically created tools.

In order to limit the short-term negative impacts, it may be advisable to dramatically accelerate the creation of technology. This acceleration must be focused and selective towards reducing impact. Moving anthropogenic activities into the nano-realm and "virtual space" of information technology will reduce, and eventually eliminate the "human footprint" from the face of the Earth.

The worst thing to do would be to halt or stifle technological growth and allow "undeveloped" anthropogenic activities to continue their negative impacts at current levels.

Re: Evolution, Technotopia & the Renewal of Nature
posted on 07/23/2002 5:06 PM by azb@llnl.gov

I am something of a singularity supporter, but conflicted as to what I see to be the varied motivations.

At the most base level, some feel that it represents a ticket to (near) immortality, the conscious continuation of their present-like self, with a lot of nicer toys to play with.

Others see it as helping to guarantee the "continuation of humanity", in varying senses of the word.

Finally, others see it most purely as the way to have the universe "maximize its positive utility" (utility to what? positive how?)

Each of these views needs to be examined for the degree to which they "make real self-consistent sense".

Suppose that current-humans fail to "save themselves" (get wiped out by robo-bio-plague, along with much of the current biosphere). As was pointed out, "nature continues", and perhaps within a few million years (say) some species of former lizard gains par-human intelligence and develops the "good" human qualities (compassion for others, etc.). If this were to come to pass, then in the extended sense of "Humanity = Continuation of Positive Human-like Values", humanity HAS continued.

If the future intellizards succeed in bringing about a Friendly Singularity, then, in this view, all is perfectly well. And the only reason we must strive to bring singularity about ourselves is that there is no guarantee that the lizards would succeed, so every species that gets to take a crack at the problem is rather obligated to try.

And if instead, the intellizards succeed with Good Friendly Singularity, they will likely create countless populations of sentients to exist and thrive in this new substrate. All would be well, and "WE" would continue to the infinite something.

Let us return to the baser motivation, "I want the present-sense-of-me to continue". This position seems untenable, and based upon a fundamental fallacy of belief in an extrinsic "self-that-continues" from moment to moment.

The "me" that began this letter "passed on" the moment it began the thought of writing. That past-person does not exist this moment, to know whether the momentarily-present-me continues that thought. It might have believed so, and the present-me certainly feels (for just a single moment) to be the "continuation" of that past self, but that is, as always, only a sensation of the moment.

If we succeed in creating a Friendly Singularity, sensitive to our (initial) guidance, what do the "present-we" expect to gain by uploading our momentarily transient sense of selves? Why create consciousnesses to exist in this new substrate that are saddled with carrying the awkward baggage of memories of "our" particular and individual histories? Even if we did this, "we" would not get to "be there", any more than "we" get to see tomorrow.

Better perhaps to engineer completely new conscious patterns, and (if sense-of-THEIR-continuity is so important) seed them with pleasant invented memories to begin with. Engineer them with good qualities (curiosity? compassion?) and leave out the bad (fear? avarice?). They will be free to choose their further future in a much more clear-headed and unconflicted manner than "our" patterns would.

So, what do YOU want from Singularity?

Cheers! ____tony b____

Re: Evolution, Technotopia & the Renewal of Nature
posted on 07/23/2002 6:26 PM by citzenblue@hotmail.com

azb,

You are my voice. I would like to thank you for your wisdom; although I do understand what the others are saying most vehemently.




Nathan C

Re: Evolution, Technotopia & the Renewal of Nature
posted on 07/23/2002 9:03 PM by azb@llnl.gov

Thank you Nathan,

It's nice to know that I make a bit of sense, now and again :)

I hope others take my thoughts into consideration. Some part of me knows that this point in time is, possibly, our only viable opportunity to take a shot at "good singularity" (or at least, well-behaved Super-AI). If we blow it, a future of randomly conflicting super-technologies will likely cause rather painful upheavals.

But we need to know just what we really INTEND in attempting to bring about a well-behaved Super-AI. If we are confused or conflicted in this, we are not likely to meet with great success.

Cheers! ____tony b____

Re: Evolution, Technotopia & the Renewal of Nature
posted on 07/24/2002 2:28 AM by citzenblue@hotmail.com

Thank you azb. I would assume the Hegelian approach, and aspire to a progressed, advanced future and all that is beneficent in this enterprise. (Of course war or conflict should never be included, especially in our times.) C. G. Jung's archetypes of wholeness should be our mantra.

Nathan C.

Re: Evolution, Technotopia & the Renewal of Nature
posted on 07/24/2002 2:09 PM by info@gulfaliens.com

There is a sci-fi book called "Diaspora" by Greg Egan which is one of the best illustrations of what a future "humanity" might be like after a singularity. There are biologically altered humans and "humans" that have migrated or have been "born" in a computational substrate. The virtual humans live as long as they want to in their computational substrate but over long periods of time they change personalities, etc, either incidentally or purposely.

I think a "Good Friendly Singularity" could still include entities or consciousness as long as it accepts the fundamental premise that it will change with the passage of time. Although it does not always manifest itself, current day humans certainly have this capability.

More important, as underscored by tony b, are the motivation and desire to work toward a singularity. The problem here is that the motivations and desires driving technology are based on attempting to ensure the permanence of current-day humanity. Motivations such as attempting to prolong lives through biotech, for example, are really only short-term goals. The question is: are these enmeshed in longer-term goals which are yet to emerge?

However, as the focus of our discussion suggests, the possibility exists that we may remain fixated on those short-term goals not by choice but because we simply are not able to go further. There are authors who argue that our social and emotional capabilities and growth significantly lag technological growth. 'The Ingenuity Gap' by Thomas Homer-Dixon addresses this premise well.

What is the mechanism, other than the submission of warning call articles to Wired Magazine by prominent technologists, for ensuring that the gap is bridged? There have been many polls where scientists agree on the need for caution. Many scientists and technologists write about the risks of 'run-away' development.

Perhaps though, this rising crescendo of discussion is part of the evolution of this mechanism. Given the rapidity of the development of technology, as these short-term goals are achieved, maybe these longer-term goals will as rapidly achieve prominence in the consideration of the evolution of technology.

The development of environmental regulations, for example, stemmed from the rising crescendo of discussion around environmental issues and the raising of consciousness and awareness of the need to address them. Perhaps this is a model and map of the process and mechanism we need to address the run-up to the singularity.

Re: Evolution, Technotopia & the Renewal of Nature
posted on 07/24/2002 3:30 PM by citzenblue@hotmail.com

Could there be a correlation between the Catholic Church's problems and its refusal to accept most birth-control practices, and might this be related to overpopulation? Would this prove that systems that do not fit evolutionary demands will diminish, and possibly go extinct?
I can see the landscape of probability bending to allow the 'Singularity' to come about. This seems to me like hard science, maybe nature trying to find the most efficiency when given limitations.
I can also see a relation in businesses finding better ways to economize to bring in more cash flow. We must admit that with Moore's Law etc., we are going in the direction of the 'Singularity'.

Re: Technotopia & the Death of Nature
posted on 10/27/2002 1:18 PM by Cesar

I think the singularity is meant to ensure survival for those who are alive when it arrives, but instead of reinventing our humanity or giving away what is human, we should use all the technology that will enable us to manipulate atoms and beyond to make the perfect human world, where you don't have to fight to make the new resources needed to build or produce all that is necessary for any given lifestyle. With infinite resources, a rational, understanding and superior consciousness would take care not to let chaos be unleashed on the human world.

This may sound like we need a God, and it may seem that way, but instead of saying there is something protecting us when there is no power to do those things at all, humans should understand the mechanisms producing unlimited resources, to make the worlds of their imaginations come true, and learn to live with diversity while never having to drive any lifestyle extinct. Those desiring consciousness beyond the human level should learn not to destroy other consciousnesses, from little plants to humans and beyond.

I believe that higher orders of magnitude of consciousness should take care of handling problems, physical or otherwise, at their level of understanding. They should only introduce a radical tool or practice that affects the system it is being applied to when they have some kind of quantum machines or nanomachines that can reverse any changes made to the fully effective programs that are already operating and handling that society or consciousness.

See, we only get to those scary scenarios (one elite controlling the human world, uncontrollable nanomachines spreading and wiping out the human race, or one government or religion deciding it has enough of what it needs) if we let technological progress go on without first building the rationalism, education and understanding to cooperate and help fellow humans and non-humans survive without wiping out competing consciousnesses. The people who could make these scenarios happen think they can only achieve success at the cost of other people, but this kind of thinking is like a computer virus that only wants to replicate its code because that is the only way it can survive, reproduce and enjoy life; and since we have this range of negative emotions like fear, it is a no-brainer that they deeply fear losing their stimulations if they tried other alternatives. This only happens because they haven't learned cooperation, understanding of others, and the use of technology to minimize conflict.

Since you may be asking yourself at this point how they are going to pay attention to all this, given that they have already rationalized their way of life: well, at some point the leader, or the people on the rise to replace the leader, will realize that by not educating about cooperation, understanding of others, and the tools of the technologies that can enable a singularity, they will never wipe out the problems that arise when dealing with other societies. They may be able to wipe out their competition, but since they don't think about advancing equality within their society, problems will always arise, and they will doom themselves by not taking positive action.

Re: Technotopia & the Death of Nature
posted on 12/20/2002 11:47 PM by James Jaeger

>Joy understands that the greatest dangers we face ultimately stem from a world where global corporations dominate -- a future where much of the world has no voice in how the world is run. The 21st century GNR technologies, he writes, "are being developed almost exclusively by corporate enterprises. We are aggressively pursuing the promises of these new technologies within the now-unchallenged system of global capitalism and its manifold financial incentives and competitive pressures."

Anyone who says that it's not relevant to discuss politics and economics here on MIND-X is simply not getting the whole picture.

James Jaeger

Re: Technotopia & the Death of Nature
posted on 12/30/2002 11:05 PM by Mitch

Who's to say that increased technology will necessarily destroy nature? Isn't it possible for us to improve nature and still develop? Increased efficiencies could easily reduce demand enough to make it possible for nature to recuperate, and the saved waste and energy would likely be profitable. Thus nature lasts, and increased efficiencies allow us to reach the singularity.

Re: Technotopia & the Death of Nature
posted on 12/31/2002 1:06 AM by Grant

>Isn't it possible for us to improve nature and still develop?

Once we start calling the shots, it's not nature anymore. It's technology.

Grant

Re: Technotopia & the Death of Nature
posted on 12/31/2002 7:34 AM by tony_b

By what measure is man, or technology "destructive" or "unnatural"?

Long ago (we suppose), before the onset of "life", the winds and chemicals eroded solids into sediments, naturally. Then along came "life" to destroy it all, reassembling chemicals into complex structures. How unnatural the amoeba!

Is it unnatural that species evolved (and many went extinct) long before man arrived? Who is to say, except ENTIRELY arbitrarily, that man's creations, or destructions, are in any way "unnatural"?

If every planet in the universe that engendered "life" eventually produced (thereby) such toxins that all life was extinguished, that would seem inescapably an attribute of nature. Why is man so special that "he" is deemed unnatural, or capable of behaving unnaturally?

If an animal evolves a form so "successfully" predatory that it wipes out its own food supply, and then starves, no one calls that "unnatural".

If man's technology destroys himself and the planet, THAT is natural. If he can prevent such a destruction, THAT is natural.

I prefer not to use the term "natural" at all, as it somehow suggests that man is to be measured by a different yardstick, or "physics" than everything else.

Select a term that conveys some real meaning, for if "man" is to be the one who behaves unnaturally, then EVERYTHING man does is unnatural, including any attempts to "harmonize" with that very destructive force we call "nature".

Cheers! ____tony b____

Re: Technotopia & the Death of Nature
posted on 12/31/2002 10:13 AM by Grant

One of the things language does is divide the world into arbitrary categories. Over the past few centuries, one of those divisions put man on one side and nature on the other. Thus we started calling man-made things unnatural. There is no law that says we have to divide the world up that way. It's just the result of how we came to see the world as we took over the rebuilding of it and tossed aside natural selection as a way of changing life and our environment.

At the rate natural selection makes changes, we would never have survived the rise to six billion people. We may not survive it now. But because we are changing our environment to suit our own personal needs rather than adjusting to the needs of the planet, many people tend to see what we are doing as contra-natural, and what nature does by way of fighting back (HIV, war, disappearance of natural resources, drought, El Niño, etc.) as natural. But the world can be divided in other ways, and new names given to the new divisions.

We don't have to look at our world that way. It's just part of a tradition that blossomed at the time of Darwin and took off with Watson and Crick. The singularity may not leave time for new traditions to develop. Traditions require the element of time over which to keep what we admire and toss what we don't. I just hope we find something worth keeping in all of this.

Grant

Re: Technotopia & the Death of Nature
posted on 01/01/2003 8:25 PM by Ecowarrior

I've never read so many ecocidal messages in my life. For you people who want to destroy nature: what exactly do you think allows you to live? This is a classic case of the dog biting the hand that feeds it, and this is why, as one poster put it, the parasites on this planet should be removed.

Re: Technotopia & the Death of Nature
posted on 01/02/2003 12:56 AM by Grant

Without life there would be no death. Without death we would not appreciate life. They are both part of the same process. Nothing on this earth lives because it deserves to. We all live because we are part of the process. We have no control over it -- only the illusion that we do. All life begins in conception and ends in death. In between, we consume our environment until there is nothing left to consume. When we are gone, something new will be conceived that can live on the altered environment. And the process will begin all over again. Nature is just another name for the process.

Grant

Re: Technotopia & the Death of Nature
posted on 01/02/2003 1:22 AM by Ecowarrior

Grant, I don't quite follow you. There's a big problem brewing on this planet that a lot of you apparently don't see, and that's the fact that an organism cannot continue to survive if it completely destroys its environment. From what I gather, I see a lot of people posting messages here that say that nature as we know it is irrelevant and that it's OK to destroy it as long as it serves our purposes. Nature is not ours to do with as we please, and we have a responsibility to see that it goes uninjured. After all, how will you feel when some big robot decides humanity is just a part of irrelevant nature and eradicates you? Would you find it morally justified in that regard? If people really want to do all of these strange Frankenstein things, they should do it on a lifeless planet and leave Earth alone.

Re: Technotopia & the Death of Nature
posted on 01/02/2003 10:28 AM by Grant

I've commented on this topic many times in the past and my conclusion is that there's little we can do to return to the past. It's sort of like saying "Let's hurry up and close the barn doors because all the horses have run away." Closing the doors will not bring back the horses.

Too much of the planet's biodiversity has been wiped out already and we still have six billion people to contend with. China is now becoming as great a drain on the planet as North America and Europe with over a billion people building cars, electric plants, superhighways and factories, and turning out the largest group of biologists devoted to genetic modification on the planet (read China: The New Cloning Superpower in the January issue of Wired magazine). India, the next most populous nation, is envious of their success and is right behind them in technological development.

The oceans have been so fished out that the fishing industry is fast becoming a fish farming industry, and most of the shrimp on the tables and in the supermarkets of the world were born and raised on wetlands in Asia and Mexico. The wild species that once flourished in the tropical nations around the world are on the verge of extinction to make room for farmers to grow genetically modified crops and animals for human consumption. Before we can reverse these trends, the planet will most likely react in ways we can't predict to reduce the surplus population for us. We are as fragile as all the other species on Earth, and it won't take much change in the weather or in the thin layer of ground we grow our crops on to create an environment too hostile to live in.

Even if we were completely wiped out by some virulent disease tomorrow, the world would never return to the way it was. It would merely start over from where we left it and produce something entirely different from what was. We can't go back. We can only try to reach an accommodation with the planet and not destroy it so completely that we end up destroying ourselves along with it.

That is the gist of my message above. Whatever we do will have consequences. A great many of them will be catastrophic for both people and other species of plants and animals. They have been already. The plants and animals we have wiped out will never come back. New ones will use what remains of the genomes that still exist to evolve new species to take their places, but what is lost will most likely never be recovered. That's the sad fact and it's a little late in the process to try and reverse it. It's like trying to stop the rain when the clouds are full of moisture and ready to drop it on our heads. Whatever we do will be too little and too late and probably just as harmful as what we've been doing.

In fact, change is overtaking us so fast right now, I doubt we can even keep up with it.

Grant

Public radio & the Singularity
posted on 04/24/2003 7:46 PM by Clifford

Environmentalist Bill McKibben has debated Ray K. at the Washington Cathedral.

McKibben appeared on Wisconsin Public Radio (4-24-03) and I asked him to comment on Bell's Singularity article in the recent issue of The Futurist. McKibben said that intelligence augmentation, nanotechnology and finally, the Singularity, are just going to be used to further entrench the rich and powerful.

McKibben has an article in the April '03 issue of Harper's magazine.

Re: Technotopia & the Death of Nature
posted on 11/05/2003 5:08 PM by Steven23

It is undoubtedly true that every step we take technologically comes with greater risk. The fact that technology is becoming more and more advanced means that the potential for increased prosperity and for global disaster are growing at an equal and exponential rate. Many skeptics believe that, as a result, the possibility of the destruction of nature has been propelled into the stratosphere.

However, we as humans have grappled with power of similar orders of magnitude for some time now. We have had the ability to destroy this earth ever since weapons of mass destruction were developed in the mid 20th century. With the inevitable wave of GNR technologies on the verge of becoming a reality, it is not difficult to think of a few disastrous scenarios that could take place. Although these new technologies bring new dangers to the table, the concept of how to deal with these dangers remains the same. With this great power comes great responsibility, and it will take patience, extensive testing, and good protection methods for this new technological forefront to be beneficial to earth and its inhabitants.

Through generations of research and empirical analysis, we have come to understand what is beneficial to our ecosystem and what harms it. Moreover, in addition to providing better quality of life, modern technological advancements have given us a myriad of possible ways to help protect the environment that were never before possible. For example, a company by the name of Distributed Science here in Toronto is said to have a peer-to-peer network of distributed computers that are working on ways to eliminate nuclear waste disposal. Further down the road, the development of GNR technologies such as nanotechnology could possibly assist environmental reconstruction tasks, such as fertilizing soils and purifying lakes and rivers. I believe that future technological advances will make it possible for us to reverse the damage on the earth's ecosystem that past technology has done. I think with time, technology and nature can learn to live in peaceful harmony.

Technotopia & the Death of Nature: Nature matters
posted on 11/10/2003 9:03 AM by julian_h99

Our world has demonstrated that it may be able to do without the assistance of nature. With the advent of cloning, we are capable of replicating the endangered species out there, or even creating a species more to our liking. So it is easy to imagine, however terrible it may be, that we may no longer need nature and what it has provided us for thousands of years. It is the moral of it all that we should hold dear. We are destroying billions of years of evolution that Earth has slowly created and formed into an intricate pattern of survival of the fittest. Nature's biggest mistake was perhaps creating us to destabilize the perfect balance, something that was perhaps too 'fit' for this world. The evolution of human bipedalism and encephalization (increased mass in the cerebral cortex) has enabled us to surpass our fellow species. It took the Earth billions of years to create intelligence such as ours. We are taking hundreds of years to replicate that feat. The Earth has been unsuccessful in preventing us from destroying it. What makes us think we're capable of stopping our own creation if it were ever to get out of hand? The Earth created us and we are destroying it; we are creating artificial intelligence. Isn't it fitting that it eventually destroys us? One can only feel a sense of irony.

As Kurzweil has given us insight into the future of our technology, we can see that it is only a few years until we can create things that are superior to us. As technology's capabilities continue to grow exponentially, what makes us able to control something that can easily spin out of our grasp? We are so busily creating things that are better, faster, and smaller that we have failed to focus on creating things to slow it down or deter it. However, doing so is simply economically unacceptable. Our whole economy is now centered on technology and on making it more attractive to consumers. So one can easily say that better, faster, and smaller is the way of the future, even though it may perhaps lead us to our own downfall. We have already been warned by Bill Joy that we should be wary of the dangers we are harnessing. But can a warning, much less a statement, stop the global economy and make it come to a standstill? It is taking decades for car companies to halt gas-driven automobile production and instead take on other means of energy. And yet it is still the same problem today. Now how does one stop global research in technology, which causes no direct harm, when we can't stop production of gas-driven cars, which have created a hole the size of Antarctica in our ozone? One can only feel helpless at what seems like the inevitable.

So what am I trying to say in these two paragraphs of mumbo jumbo? The bottom line is that life is meant to be preserved, whether it is a pesky fly or our own dominant selves. And if the question becomes whether we need nature to coexist with us or not, who are we to decide and play god? And since when did technology become more important than nature? As Craig Holdrege has stated, 'The science of biology is losing its connection with nature.' Technology is wonderful and its capabilities are beyond wonder. The creation of MRIs and CT scans has saved millions of people every year, but so have natural medicines from all over the world. We hold the key to the Earth's future; should we open a door that leads to a world of technological bliss, or a door where nature still plays its role as a fellow resident? Well, that is up to us to decide, or actually, to the CEOs of global technology companies who seek the billions of dollars flowing into their cybernetic hands.

Re: Technotopia & the Death of Nature
posted on 11/14/2005 5:24 PM by braveheart

I believe that Mr. Bell brings up many interesting points about the future, but many of his views are very utopian. The concept of the Singularity, of nature and technology becoming one, is in my eyes very unrealistic, because nothing lives forever; life and death is a never-ending cycle that we see in every aspect of life and in the universe. To say that 'we are on the edge of change comparable to the rise of human life on Earth' (Vernor Vinge) is a bit of an overstatement, especially because we do not know when, how and where life actually began. Sure, human life is a big part of our existence; however, to nature as a whole we are no different than the dinosaurs that ruled the earth. Yes, we are smarter, blah, blah, but in the end we are just as sensitive and dependent on nature and its forces as they were. I do not believe I have to prove my point beyond all the natural disasters, such as the hurricanes and floods that occurred this year, which we could do nothing about but watch as they brought total destruction to human creations. Entire cities that took decades to build in the most advanced nation in the world were destroyed within a few hours by the simple forces of water and air. One thing that man as a whole has and always will have that nature does not is hubris. We take too much pride in and give too much credit to the things we create. Nothing we make will be perfect, because it is man-made, and nothing that is man-made will be superior to the forces of nature.
I'm sure that as technology advances we will make great leaps in artificial intelligence, but whether we will achieve it in the near future is a big if. However, the key word here is 'artificial,' not real, meaning man-made. Anything we create will be only as good and as advanced as our technology is, and our technology is nowhere near the human brain. Sure, we can measure how many calculations the human brain can do, and how many computations a computer can do, but intelligence is much more than simple calculations. Intelligence is shaped through experiences; it is affected by our personalities, religion and many other factors that we can't even begin to explain. How can we possibly make machines that are self-sufficient and more intelligent than us when we ourselves barely have an understanding of how our human mind works? Sure, we have made great leaps in its understanding, but there is so much more that we still do not know. We can make all the predictions we want about the effects of technology on our civilization in the future, but there are way too many uncontrolled variables in this equation, so anything we say is mere speculation and nothing else.
As for the world population growth problem, I believe it's a simple supply and demand problem with a twist. Just as the author said, we will reach an asymptote, and it will be affected by a few variables in our society. As we can see today, Western civilization's population growth is barely on the rise because there is no need for it, as technology is taking over jobs, making them simpler and requiring less manpower. Almost half of the world's population lives in countries that could be labeled third-world countries, whose technology is nowhere near that of the top nations. Those nations' populations will continue to grow until their societies are able to adjust to the technology movement. This is a problem that many may not see as one, but what is going to happen when those nations can no longer support themselves? Where will those people go? One major issue the author brought up that may become a problem is the advances in nanotechnology, especially the self-replicating kind. I myself am a Stargate SG-1 and Atlantis fan, and this reminds me of the Replicators, who are a mix of the Borg from Star Trek and the Wraith from Stargate Atlantis. With nanotechnology we may create something that we can't control and whose main purpose will be to replicate at all costs. I think nanotechnology is just as dangerous as cloning, if not more so, because it can't be controlled.
To wrap things up, I believe that humanity as a whole is too young to truly understand and be capable of creating true artificial intelligence that would be able to sustain itself and grow as a society for long periods of time without bringing itself to total destruction. Such things will not be possible until we are able to truly understand the inner workings of life, and we may never be able to, simply because of the way we are. Until we eliminate things like hunger, racism, war, greed, disease and money, our society will remain in much more danger of being destroyed by itself than by an intelligent life form we create.

Re: Technotopia & the Death of Nature
posted on 11/14/2005 11:19 PM by eldras

I dunno about nothing living forever.

presumably stuff does.

By living you may mean 'holds its recognisable pattern'.

But then you are onto different definitions of life.

Life is really a biological subject, and new tech has to go much further.

Being scanned and written as an equation... to be made up in a local hospital at some future date, could mean you are still alive as long as any accurate representation of you exists.

Or quantum archaeology, based on quantum probability maths, could raise you, which would mean that you potentially never die.

My mother just died and I saw it as a mixing with the earth, although I expect to see her again post-Singularity through resurrection science.


I can't believe a person is going to be SO complex to a superintelligent computer as to make their resurrection impossible.

Resurrection will just be another branch of medicine.

"in the old days, they thought death was the end!"


As for how long a pattern can be sustained, I dunno. As long as there are records.

I would expect to meet other intelligences in space who have gone through this, and a shortcut to a singularity would be alien contact.

Re: Technotopia & the Death of Nature
posted on 03/30/2006 10:12 PM by Tasneem

I agree with the previous poster, 'braveheart'. Technology and nature can never 'become one'. We are either natural or we are not. Just because we can invent implantable RFID tags for humans, or computer hardware that can 'surpass human brain power', or software that can emulate the human mind, as James Bell mentions in his article, does not mean that we will be replaced by 'superhumans'. I choose to define 'natural' beings as beings that need nature's raw resources to survive and grow: food derived from trees, animals feeding on plants, etc. Unless we can find a way to start living off of things such as solar energy or battery power, we are still part of nature. So the concept of the 'Singularity', where nature and technology will become one, is somewhat out of reach, as far as I can see. We are separate entities from the tiny computer circuits we build.

I also agree that we can't really create artificial intelligence that can surpass the human mind when we don't fully have an understanding of how human brains work. Intelligence is definitely measured by more than the number of calculations possible per second, and its superiority depends on many factors such as a person's unique personality, imagination, experiences, creativity, and intuition. These qualities play a big role in technological progress. How can we continue to progress and invent new things without imagination and creativity? It is well known that many of the most useful inventions were born of accidents followed by creative thinking, such as Scotch tape and Velcro [1]. Also, experience is required for many employment positions for a reason. In-depth study of a particular subject is not enough to be able to hold certain positions. A person with no experience in helping people would not make the best candidate for a nursing position at a mental institution. A lot of knowledge and 'intelligence' comes from experience. It is impossible to take into account all the possible things that need to be considered while performing a particular job. I think the human mind is much more powerful than any artificial intelligence we can create at this point.

As far as the threat of humans eventually turning into 'Borg'-like creatures goes, I don't think we have much to worry about. It is highly unlikely that all humans will choose to put cybernetic implants in babies by default as soon as they are born (or right in the womb, for that matter). It would be a risky process, and conditioning our children with artificialities and taking risks with their health is not something we humans like to do. We inherently give importance to pure nature (including anything from a natural environment such as gardens and rainforests, to natural, healthy babies) because its resources are what we depend on for survival. It is a human tendency to get back to nature to find peace of mind.

However, with the kind of advancements in technology we see today and with the exponential growth rate of technological progress, we will probably end up battling to survive against artificial life-forms that have gone out of control sometime in the future. Most of us are sane enough to implement laws and standards to ensure that the technologies that threaten our existence are kept in control. However, there are always the not-so-sane people who wreck it for the rest of us. For example, the University of Texas at Dallas has come up with alcohol- and hydrogen-powered artificial muscles that are 100 times stronger than natural muscles [2]. These can be used to build 'more efficient mobile robots, flying vehicles, prosthetic limbs, lightweight exoskeletons, and artificial hearts [3].' While this is amazing news for people such as the physically disabled, there will always exist the possibility that some people will use this technology in the wrong way and let loose some out-of-control, super-strong robots to destroy everything in their path. We are definitely capable of complete self-destruction. While it is easy to say that we need to think twice before introducing the kind of technology that is capable of vaporizing the surface of the Earth several times over (such as the atom bomb), we know that such things are unavoidable. If we are capable of inventing a technology, it is bound to surface sooner or later (and get into the wrong hands).

Self-replicating GNR technologies are indeed another serious threat to human life in the future. Suppose we program robots that can replicate themselves in the thousands, and suppose they require humanoids (or parts of them) as a resource. Who is to say that someone won't 'enhance' this creation by defining this life-form's existence and reproduction as a higher priority than a human's existence? If this happens, we are doomed. So we are more likely to destroy ourselves through one or more of our creations than to become extinct by merging with our technologies to evolve into artificial beings.

While we need to be careful before using technologies that can threaten our very existence, I think it is in our nature to anxiously implement what we have invented and only truly think about its consequences when it's too late. 'We can mass-produce all sorts of wonderful technology, but we forget to figure out how to un-do much of it... We can now create genetically engineered crops which have useful properties, but, oops, they got away and are creating havoc in the wild, because we didn't really understand how the natural cycles work. We can create nuclear bombs, but we forgot to invent the anti-dote, and, oops, suddenly some terrorist group has one of them... We invent a way of doing something, and it seems really useful at first, and we're really proud of ourselves, but we forgot to think through how it would really work within the eco-system of the world we live in, and we're incapable of cleaning up the mess [4].'

Re: Technotopia & the Death of Nature: A Student's Perspective
posted on 03/15/2004 4:58 AM by Red_Ruby

In reading this article, I have become aware of important issues that I was oblivious to before. I had thought that self-replicating machines, artificial intelligence, and cybernetics were terms used only in science fiction books and movies. Imagine my surprise when I read that these are possibilities that could come into existence 30 years from now. (That's within my lifetime, and here I am merely wishing to see flying cars!) But of course, these technological advances will have serious implications for life and for the future of our planet.
As much as we may want to stop this from happening, we cannot. Bill Joy's suggestion of halting technological advancement comes a little too late and is quite impossible to accomplish. First, it is late because we have already become dependent on technology. It has become a necessity, and our daily life cannot be divorced from it. Second, it is impossible to stop advancement because the world hungers for it. Each sector of society seeks technological advancements for its own purposes. The scientific community is driven by the pursuit of knowledge. The economic community is driven by ways to gain more profit. Governments, meanwhile, are driven by the need to control. The only way to stop it, as Kurzweil noted, is to employ a 'totalitarian, state-enforced ban.' But even this method may fail.
After much serious thought, I would have to agree with Kurzweil that the progress towards the Singularity cannot be stopped. In fact, the more I think on it, the more I start to believe that all along the purpose of technology has been to one day evolve into a human-like creature or to meld with biological Man. Consider the following: what is technology being used for? It is used to improve the quality of life, mainly human life. We have computers to help us in our work, cell phones so that we can get in touch with each other, TV for entertainment, etc. Won't it be more convenient if machines can respond to our specific needs? If I am stressed, I do not want the TV to be showing me a suspense or action movie. A comedy would be more suitable to the situation. But this would imply that the TV must know how to interpret my facial expression and my body language. If it can read my emotions, is it not conceivable that it may also be able to express the same emotions? Even if the answer is no, at this point, it may not be far off before machines gain consciousness. And when they do gain awareness, do we consider them living or not? And what happens when we have improved all that surrounds us? The answer is simple. We start improving ourselves. It is a scary thought but true enough. Just look at some well-known celebrities: their material needs have all been fulfilled, so they start improving themselves, a little face-lift here, an enhancement there, and so it goes. But these changes are not so important compared to what technology can offer. There are so many possibilities in which technology can improve a person's abilities. A chip can be embedded in the brain to improve memory. A device can be embedded in the eye to provide 20/20 vision to those whose eyes are impaired. And there is so much more that the public would be willing to pay for. Therefore, I believe that the Singularity is inevitable.
The question on people's minds today is whether the advent of the Singularity will mean the death of nature. It is a tough question to answer because it is so vague. One is not exactly sure what is meant by 'nature' and 'the death of nature.' Does 'nature' refer to living things, and does 'the death of nature' mean the death of all living things? Are we contemplating the possibility of machines ruling the world, and do we call that 'the death of nature'? One is not exactly sure how to take the meaning of these words. But one thing is for sure: with all the wonderful promises of technology, I do not believe that Man can remain the way he is today. We will utilize this technology even if there are serious consequences. Research and experiments will continue, but warning signs will also be heeded. Although technology affects everyone, in this case it is ultimately the scientists who are responsible for ensuring that advancements can be kept in control. In the years to come, I believe that the Singularity will be achieved without the public's realization. It will be a subtle change that will go unnoticed because of the ease with which new ideas are adopted by people today. Surgical implants of ability-enhancing devices promise to be a very lucrative business. Meanwhile, people may be reluctant to adopt machines that have feelings. These are all speculations, since we cannot know what the future will bring. Oh Brave New World, be merciful on us!

Re: Technotopia & the Death of Nature: A Student's Perspective
posted on 03/15/2004 11:14 AM by grantcc

Creating a superior intelligence is one thing. Getting people to pay attention to it is another.

Re: Technotopia & the Death of Nature
posted on 11/15/2004 5:29 PM by tharshi

It is true that 'with great power comes great responsibility'. The power humans hold through increasing technological advancement is a huge responsibility that might lead to potential risks we have not yet become aware of.
The fast-progressing developments in genetics, nanotechnology and robotics are the signs of the coming Singularity. Although this may open doors to new creations that we may never have dreamt of, it also raises many concerns. One of the main concerns to be raised is: 'Are we leading our own downfall?' Looking at our evolution over the years and the destruction we have caused to the very nature that created us, it certainly is not a surprise that this should be of concern.
Today, the world is so heavily consumed with the goodness of technology that we fail to realize how much it has overtaken our part in this world and the amount of destruction it has caused. We are already facing enough crime through the existing technology, to which we cannot find a solution. Continual growth would only make these crimes easier to commit. Another thing to take into consideration is that our society is already facing cutbacks in employment due to technology. Within a few years, there may be superior machines built to replace the current jobs. Then where would people earn their income? How would we make a living?
Through the evolution of human bipedalism, we have surpassed the other species of this world. But with fast-paced technologies, we have also created a division among ourselves. The fact is, we have already stepped away from our own kind. The first-world countries are exposed to genetically modified food and other technologies as part of everyday life, whereas one third of the world has not even come across the concept of the telephone, let alone come to familiarize itself with the idea of genetically modified food or any other technological concept. Because of this, it would be safe to assume that nature will not be relinquished completely.
But what if it comes down to us making our own kind extinct? The earth has been a mother to us for thousands of years, and it did not take us long to harm it. There is no guarantee that our own creation will not get out of hand one day and wipe out the rest of the world. Many nations are already facing enough controversy among different peoples over religion and land resources. The existing technology provides easy access to numerous weapons that they use to kill one another; giving them more power is like adding more fuel to the fire. It certainly is not going to make things better, but worse.
The real enemy here is not the technology but the people who are pushing it to the limit. Technology is a great thing. It has done many good things for us. But we should also be wary of it and fear it, because sometimes we are not capable of understanding our own creation and its harmful nature, and we use technology as a tool for destruction. If anything, before we go ahead and continue with technological progress, we should stop and reevaluate the purpose behind our creations rather than trying to explore the extent of our capabilities.

Re: Technotopia & the Death of Nature
posted on 11/15/2004 5:43 PM by mr migu

If the singularity is to occur, I see two different paths our existence could travel: either in a positive direction, possibly toward a utopia, or a negative one, toward a dystopia. The outcome will depend on how well we are able to control the technologies being produced. If we can successfully control them, the singularity may bring about a period in which everything, from the world's economy to the labor needed to produce goods, is handled by machines. It is also possible that machines could work with nature, not just against it. Machines could be made responsible for the survival of all natural things, maintaining a consistent natural balance in which the survival of all is the aim. In such a world, humans might be left to focus on culture and individual happiness instead of dealing with the problems of today's life.
Such an existence would be possible so long as we do not allow problems with any developing technology to grow out of hand. For example, disaster could erupt from the genetic modification of human food sources. Unforeseen consequences of these changes could upset the natural environmental balance and wipe out many life forms, ending with the extinction of many species, if not the death of all nature. Unfortunately, I think too many of us are confusing human interference with nature for natural occurrences.
This point of singularity, instead of being seen as the end of all nature, can be seen as a point at which nature takes a major evolutionary leap. If by then we are capable of developing machines that are in every way superior to humans, or even of uploading human consciousness to a machine, we are in effect creating superior humans. If a purpose of life is to propagate, to ensure the survival of the species, then could we not regard creating more intelligent, human-like machines with a far greater ability to survive as successfully fulfilling that purpose?
Is there really much difference between a biological machine and a mechanical one, if the mechanical machine is capable of doing everything the biological one can and more? If these mechanical machines can live under more extreme conditions and consume fewer resources, would this step of evolution not be a positive one? At the rate we are currently living, not only are we killing off many species, we are consuming resources faster than they can be renewed. If we are facing such a biological collapse, then why not try to evolve out of a biological state?

Re: Technotopia & the Death of Nature
posted on 07/14/2006 4:47 AM by dark&cold

We need to realize that we are the ones who have to die; we can afford it, after all.