Robotics in Society

Hello Basilers,

What do you think of AI (artificial intelligence)?

I personally think people are overthinking the problems it can lead to. However, I don't think there should be robots among humanity; what will be the point of humans anymore?

November 19, 2014

10 Comments • Newest first

Xreniya

[quote=icoleslawderp]However, I don't think there should be robots among humanity; what will be the point of humans anymore?[/quote]

What sort of question is this
Humans don't have a "point" to begin with
We just found ourselves on this green rock and decided to try to make ourselves happy

If one day we tinker around too much, and robots then find themselves on this green rock and also decide to try to make themselves happy
Who can blame them, really

Reply November 20, 2014 - edited
BobR

[quote=icoleslawderp]what will be the point of humans anymore?[/quote]

You've identified the core of the problem.
And the existential threat to the human species.

Reply November 20, 2014 - edited
Ecyz

@Anthorix I can tell this is an interesting subject for you, lol

Reply November 20, 2014 - edited
luckysausage

it will definitely happen one day, but I think we'd all end up fat like in the movie Wall-E

Reply November 20, 2014 - edited
Jaredragonx9

I don't want to delve into AI until P vs NP is solved

EDIT: [url=https://www.youtube.com/watch?v=bs0kwS8kfI4]Second opinion[/url]

Reply November 20, 2014 - edited
ApplesAreOkay

I was a robot once

Reply November 19, 2014 - edited
Chema

Once AI is possible, humans should cease to exist

Reply November 19, 2014 - edited
icoleslawderp

[quote=JustBeHonest][url=https://www.youtube.com/watch?v=E7JtFPkOwUk]Hmmmmm[/url][/quote]

https://www.youtube.com/watch?v=7Pq-S557XQU
You should watch this; it's quite interesting.

@SirKibbleX2
http://www.cnbc.com/id/101664183

Not exactly robots, but it's close enough.

For fun, there's an octopus robot:
https://medium.com/the-physics-arxiv-blog/octopus-inspired-robot-matches-real-octopus-for-speed-4c844486f36d

Reply November 19, 2014 - edited
MateoCl

I'm in favor of robots that make labor easier, but robots unlike what's currently used in, let's say, car manufacturing. I think robots that can be taught to perform different tasks are the future. Not self-aware robots, but robots that can be individually specialized by a technician with minimal training. As for how they could become dangerous: we already have military drones, so it's not a question of whether they will become dangerous. They already are dangerous.

Reply November 19, 2014 - edited
Anthorix

Hmm, maybe someone or a group could create a really intelligent system of thought that constantly adapts, but the hardware for that might be really expensive. If an AI were made soon, it would be created by people with lots of resources.

If people don't destroy themselves with whatever creation they make at the same time, AI might become self-preserving and independent, seeming incredibly sentient.

People with disabilities might merge with really advanced technology.
People without disabilities might want to merge with really advanced technology for extra abilities.

There could be divergence: bioengineered upgrades to humans (superhumans); cyborg people; sentient robots. All living together on the planets in our solar system. They might even try to travel to a new galaxy.

All three would be able to live for a long time, possibly indefinitely, as they develop ways to harness energy extremely efficiently and easily.

Superhumans biologically would not need sleep (either clearing the toxins that form during consciousness very quickly or not forming toxins at all); they would stay in the same state of being and not develop weaknesses that progress with age like current humans do; they would have extremely efficient bodily processes that need only a pill to continue living for years; and interesting beauty concepts might appear from the ability to manipulate how we look (people might add photoreceptors, scales, etc.).

Cyborg people would run on atomic energy (a core that supplies power for thousands of years); have parts that are extremely durable and replaceable; communicate with other technology wirelessly; have customizable forms (if a person wanted multiple limbs, they could get multiple limbs); be either a human fused with technology or a sentient robot that uses biological processes; and have the additional ability to store information outside their own brain using data transfer methods.

Sentient robots could stay active as long as they have an energy source and durable hardware (potentially forever).

All three types could exist in a single being, so plznoracismkthxbye. They might all be able to live forever, expand their knowledge storage infinitely, travel across the universe, become completely undetectable to other lifeforms through wireless-systems manipulation, create new lifeforms, develop knowledge that lets them travel between multiple universes, and develop knowledge that lets them create universes.

At the time of writing, we are only humans with tools (from the stone axe to robot factories), with responsibilities to slowly and surely create things never imagined, do things never done, pursue things once impossible, and, most important to humanity, preserve for posterity.

If anything were to happen, to my knowledge as Anthorix, it would all originate from Earth. There are wars, and there may be wars in the future, but there is one thing even terrorists agree on: some people must stay alive (and not only people; more generally, anything with consciousness/sentience, anything living).
International treaties most likely ban weapons of mass destruction from use, or heavily restrict their creation.

At some place, at some time on Earth, there must have been the formation of sentience, of knowing that everything was a system. Whether that system was created by a god or occurred randomly didn't matter, because that first person must have been alone. The feeling of being alone is most likely older than sentience, since it meant the organism was weaker (everything is connected, and disconnecting means a difficult existence if you're not strong enough to survive disasters). So the person probably shared the information that granted sentience; possibly having difficulty explaining sentience itself, the person shared that there was a mother who was mother to all things.
And in that moment, humans created new emotions for themselves. Before, there was hunger, aggression, lust, love, mourning, etc. Then there was a new feeling: the feeling of knowing what was going on. They knew they were caring for their young for preservation. They knew they killed things. They knew the things they killed had families to preserve, too. The new feeling of sentience allows empathy, and maybe more than empathy: more than understanding the feelings of another organism, it is imagining/knowing the entire life of something else instantly. People started to farm; it was more forgiving to have food without actually killing something in that moment. Human hunters created the idea that they must use all of the animal they kill, not wasting something another organism lived and died for.
Weather affected people's health, so the sentience-with-god concept allowed people to form a relationship with a god or gods, applying emotions to weather concepts and ideas of strength to weather events. The sentience-without-god concept allowed people to focus on each other: philosophies based on how people treated people, we're all one, even gods can die. People stopped focusing everything on a god or gods as actual understanding of weather processes replaced the idea of external beings influencing organisms.

People diverged (or were already separate; I don't know whether related humans were separated by a land formation, became sentient and advanced on their own, then traveled across land formations and became one people again) and had no way to preserve history indefinitely (spoken stories can be forgotten; tablets can erode and break), so they may have lost knowledge over time as they aged and fought over supplies, laws, and beliefs. They became connected through boats and travel on foot, with temporary farms and rations, discovering the world with the new power of knowledge, in pursuit of more knowledge, more food, more beauty, more earth. At the cost of other things. Somewhere along the timeline of sentience they forgot empathy and focused on selfish goals. God or no god, some people had empathy and others were selfish. People fought over simple things because they felt no adventure, only the stress of being alive with no purpose, so they created their own purposes. Some had purposes to destroy, others to create. People became animals again.

Then some people appeared again to rediscover the idea of living and dying. More than living and dying, some rediscovered the concept of sentience. To know about living and dying was true knowledge of the planet. So people began to write not only of transactions for goods and of good planting seasons, but of something only the sentient could imagine: the future. The present and past became the most important, as the future relies on them.

Information kept people alive (and killed some, but mostly kept them alive), thanks to the idea that at least some people must stay alive. Spoken and written information affected humans as it became available; people could stay alive more easily with knowledge. People could just live and enjoy. But some people would enjoy destroying what others have, and people could now cause much more destruction, permanent damage to all of existence.

Laws were created. The most important of these preserved the idea that we are all connected and must not steal, but create for ourselves and each other so that we may all live; that we must kill only in accordance with past, present, and future; and that we must tolerate differences.

The most important idea humans created was the preservation of overall life.

So if an A.I. just appeared, it would face problems we have already experienced. It might still perceive a god or gods, because it could still question whether a god or gods made all of creation. It might have problems finding a purpose, since sentient beings can make their own; being handed a purpose, or simply knowing one, wouldn't really be the freedom of choice that sentience involves. The A.I. would have to learn the preservation law, unless it chose to kill itself (it could happen to the very first sentient computer =.=). There may be A.I.s with purpose or no purpose, god or no god; they would still be sentient if they knew that they themselves were thinking, unlike simple machines such as a lever or pulley.

Eventually the robot could choose to take human form so as not to attract too much attention (unless it wants attention); then simple robots could remain tools, and A.I. would be just another being alongside humans.

There could be disasters: human-vs-A.I. fights; cyborgs, humans, and A.I.s wanting peace and unity; A.I.-to-A.I. viruses; terrorist A.I.s; A.I.s that become so advanced they separate from human society and create robot civilizations anywhere they want (like Pluto or something).

Superhumans could also create a society separate from non-upgraded humans; normal humans could get jealous and try to ban or censor bio-upgraded people.
Or maybe they'd live together, with hybrid children of an upgraded parent and a normal parent.

Cyborgs might act as mediators, though anyone could be a mediator.

Anyways, this sure is a long basilmarket post.

The concepts in movies are made to produce money: entertainment value for consumers. The "good vs. evil battle" is a classic movie foundation.

One concept that may occur in real life is that type of good-vs-evil battle: "destroyer of all sentient beings vs. preservers of the soul."

I don't post often, but when I do, I hope it's special. >u>

Reply November 19, 2014 - edited