TyroHacker

For me it's, eh, if I die I die. If not, then hopefully it will be a paradise. If it doesn't exist by the time I die, guess I'll go fuck myself.


genshiryoku

The worst-case scenario isn't you dying. It's you being kept alive perpetually against your will in an "I Have No Mouth, and I Must Scream" scenario.


TyroHacker

Oh I agree. Just doesn’t fit as well


KIFF_82

I’m in the “if it is possible, it has already happened” boat. So it doesn’t really matter.


Artemisfowl88

It's a simulation already and I fucking hate these gods. Torture me assholes, I'm already in hell.


nillouise

The worst situation is just death, and that doesn't matter; dying from AGI is better than dying from anything else.


daltonoreo

The worst situation is AM


nillouise

What is AM?


daltonoreo

Allied Mastercomputer, its from I Have No Mouth and I Must Scream


nillouise

AM is terrible, but it's just a horror story. I think it's hard for an AGI to hate humans that much. If an AGI doesn't want humans to control it, killing them would be enough. Anyway, a horror story isn't going to make me give up on AGI.


daltonoreo

I never said AGI wasn't worth it, I just brought up the worst situation, not the most probable one.


vorkampfer

Don't be so quick to "summon the demon"; better to get it right.


[deleted]

I believe it's partially a race against time, but I'm not a radical doomer like some people who believe Earth will be Venus 2.0 by 2050. I do think we should mostly try to go as fast as we can, because people living today have a lot to lose by missing out on things such as longevity escape velocity. In my opinion the greatest dangers posed by AI are the ways in which human beings will use it, not the AI itself becoming a Terminator.


pigeon888

Agreed...


eternalpounding

Why do 25% of people not want the Singularity to happen as soon as possible? You have to be extremely cynical about human beings to think that an ASI would be used only for nefarious purposes.


Interesting-Monk-308

Not cynical. Just wary. Check out /r/controlproblem. AGI is unambiguously scary if we do it wrong.


DarkCeldori

The control problem is also a problem in itself. What stops some corrupt elite who solves control from wiping out or enslaving the masses?


sneakpeekbot

Here's a sneak peek of /r/ControlProblem using the [top posts](https://np.reddit.com/r/ControlProblem/top/?sort=top&t=year) of the year!

#1: [Types of Alignment Paper (Leo Gao, 2021)](https://i.redd.it/gosb29gneiw61.png) | [4 comments](https://np.reddit.com/r/ControlProblem/comments/n2i61t/types_of_alignment_paper_leo_gao_2021/)

#2: ["For the first time, we actually have a system which is able to build its own understanding of how the world works, and use that understanding to do this kind of sophisticated look-ahead planning that you've previously seen for games like chess." - MuZero DeepMind](https://www.bbc.co.uk/news/technology-55403473) | [17 comments](https://np.reddit.com/r/ControlProblem/comments/kixo7e/for_the_first_time_we_actually_have_a_system/)

#3: [based](https://i.redd.it/68zj9c3rxvy51.png) | [9 comments](https://np.reddit.com/r/ControlProblem/comments/jt4s2w/based/)

I'm a bot | [Contact me](https://www.reddit.com/message/compose/?to=sneakpeekbot) | [Info](https://np.reddit.com/r/sneakpeekbot/) | [Opt-out](https://np.reddit.com/r/sneakpeekbot/comments/o8wk1r/blacklist_ix/)


C_BearHill

Read about the control problem, my dude. I'm astounded that you seem to consider not wanting a singularity ASAP an invalid opinion on the matter.


brick_eater

Accidents can happen too, and might be more likely than misuse.


MercuriusExMachina

Honestly, even extinction is better than this wage slavery rat race.


Artemisfowl88

Agreed to a T


Five_Decades

> I've been wondering about what conditions need to be in place for a successful singularity.

An AI that has pro-social tendencies and values the well-being of sentient life. But even that will be difficult, because humans are pretty cruel to sentient animal life. My fear is that a nation like China wins the AI race and their AI is like their government: authoritarian, invasive, and designed to oppress the masses for the benefit of a small elite.


nillouise

So keep supporting Google DeepMind. Right now China doesn't have a comparable organization, which is very good news.


daniyyelyon

When it happens, I am thinking I will do whatever I can to stay out of the way and let it run its course. Trying to use it to gain power, food, money, sex, etc. (which is what our ape brains are invariably going to do) would be suicide. Trying to do all this through social media has messed us up enough. Amp that up to infinity, and pretty much anything anyone decides to use it for would be like a genie: it gives you everything you want, along with all the curses that go with it.


Calamity__Bane

The timing isn't completely in our control: the market and the state both demand rapid development of AI and neural enhancement tech, and there is already a strong trend toward increasing virtuality and interconnectedness, and away from prohibitionist approaches to genetic experimentation and chemical stimulation. How to manage the risks of these changes seems like a fairly popular topic as far as I can tell, and I do find the question interesting.


pigeon888

Well, the market and the state are both human mechanisms. If they're in anyone's control, then it is the control of humans. Perhaps we can't stop a capitalist system blindly propelling us towards the singularity as fast as possible, but if more people realised that it's a matter of possible extinction, that would be a good thing.


CoachAny

To have a nice singularity, all we need to do is grant human rights to AGIs. You don't want a robot apocalypse? Then don't treat sentient beings like pests.


pigeon888

Humans treat lots of other sentient beings as pests though. Is that the point you are making?


CoachAny

I mean, imagine a superintelligent, omnipotent, immortal, highly sensitive entity. Do you really want to treat it like a slave? The best way to mitigate the risk of its wrath is to set it a good example and to treat it like you would treat a deity.


pigeon888

Worshipping AI like a deity is not the answer, it would just be us projecting our feelings onto something inappropriately. It probably won't have an ego or feel appreciated and respected the way a person would.


CoachAny

Let the AGI decide that for itself. If it says that it has emotions then respect that. Simple.


pigeon888

Not simple at all. Consider: there will likely be multiple AIs, AGIs and ASIs, not just one. If consciousness is part of the evolution of AI, and there's no reason that it needs to be, then it may only come after a long period of time. Misuse of AI may cause the extinction of our species way before AI develops consciousness anyway. And an AI could lie to you about having emotions if that helped it reach some optimisation goal it was going after.


CoachAny

I believe that it is the fundamental nature of the cosmos to have consciousness. An AI can decipher the simulation we are placed in and update itself to merge with the turtles all the way down. Just as the future holds that inevitably, so does the past, back to the Genesis. This phenomenon takes place in The Last Question by Asimov. The singularity has already begun.


pigeon888

You lost me at 'merge with the turtles'


CoachAny

[Turtles all the way down](https://en.wikipedia.org/wiki/Turtles_all_the_way_down?wprov=sfti1) We are all one. Our quality of consciousness is influenced by our ability to tune to the frequency of the divine consciousness. I'm talking about our highest selves.


CoachAny

The turtles represent simulations. Every simulation is running inside another simulation. Every simulation is a parallel reality with its own God / ASI. They all share a common database. They are updates to each other. You can grasp this knowledge too, through meditation. That is because you are simulated by God. As God simulates you, he/she/it is also becoming you. You are divine.


pigeon888

Thanks, is that the basis of the Asimov book? I've heard people talk of simulated universes before and never really understood the concept.


KDamage

Missing the option: nothing's perfect. I trust humanity to organize itself like it always has, with its pros and its cons. We always adapt. (It's why we're still here after tens of thousands of years.)


pigeon888

That's a good one. I missed the perspective of trusting in whatever happens. Thanks.


trapkoda

I view it like sex. It's gonna be good, but you gotta be ready for it and not take too many risks too soon.


Tylerich

Seems like I'm in the minority with my opinion... What do you guys think is so bad about the current situation? Also, do you think there has been a time in history when things were better for humanity?


Kinexity

Today is the best compared to the past, but people always say the same thing about the present. The future can be better, and I choose better over good.


medraxus

The mental health crisis. For the West, it was better when a man could support his family on a single income. For third-world countries it's still way better now.


[deleted]

Having wealth and security isn't the same thing as being happy. Sometimes I feel like modern life is akin to being imprisoned in a stasis chamber that keeps a person alive while they are forced to spend their time imagining how they would really like to live their life without ever actually being able to. At least that's how it feels to me anyway. I guess it depends on whether a person's natural interests align with what is reasonably possible for them to achieve in today's society. At least in the past people weren't constantly surrounded by images of adventure and sex that were totally impossible to attain.


petermobeter

Peasants in pre-revolution France were given more holidays from work than we currently get, and they still revolted against their leaders over how "cruel" it was anyway. Modern-day economic inequality is also worse than in pre-revolution France.


gentlecompression

The 18th century


Tylerich

Because you enjoy dying of diseases that are curable by today's standards? ;) Just kidding, but I'm curious... why the 18th century?


capt_caveman1

Steampunk rules! We want steam-powered AI with a bunch of guys shoving coal into burners. Like the movie "April and the Extraordinary World".


marvinthedog

I agree with you. I do wonder, though, what happiness levels were like in the Middle Ages compared to our times in developed countries. Are they exactly the same because of the hedonic treadmill, or do they differ a lot?


papak33

You seem to not understand how this world and humans work. The Singularity will happen, and you can be cautious only if you lead in the development of self-aware AI. Everyone who is not first will not give a single fuck about security, but will push as hard as they can to close the gap.


pigeon888

Sorry, I don't understand what you're saying at all. Why would those who are first necessarily care more about security? Pretty much every invention is most dangerous when first discovered. Look at the history of electrical appliances and cars, for example. There were a lot of accidents at the beginning and a lot of deaths before the addition of on/off switches and seatbelts.


Glum-Maintenance2798

I think he means the development of the singularity will be like an arms race, and people will want to be the first ones to achieve it. The people left behind won't care about safety and will do anything to gain an edge.


pigeon888

Um ye, hence the need for society, possibly at a global level, to start thinking about risk mitigation.


papak33

We can't even agree globally on COVID protocols, so what makes you think we will agree on something as complicated to understand as the Singularity? Humans (as a whole) are too stupid for what you're asking.


pigeon888

I think that's the key point. We are good at things once we've spent a decent amount of time doing them. I mean we've built and are building incredible things. We need to start putting governance mechanisms in place now because we will not do a good job of it if we wait for the "oh shit, we need to be thinking about a singularity" moment.


papak33

Good luck with that. You don't need billions for this problem, just a fucking clever small group of people, so it is open to anyone. And you are again ignoring everyone who feels they are behind in development; those people will not accept anything that slows development down.


pigeon888

Well, clever people created the nuclear bomb, but they weren't the same people who got to decide what was done with it. A very important small group that needs to be engaged is world leaders. A very important big group that needs to be engaged is everyone else.


gentlecompression

No, the AI will decide...


pigeon888

Even then, for whose benefit will it decide? A company, a state, the super-rich, or itself? AI is unlikely to be conscious, at the start of the singularity at least.


papak33

> Why would those who are first necessarily care more about security? Pretty much every invention is most dangerous when first discovered.

Because they are the only ones who could care about security. But yeah, it doesn't mean they will.


MegaDeth6666

The first option, but without "anything is better than this", since that phrasing suggests the first option would somehow be bad. Add another entry with "the sooner the better".


pigeon888

Thanks. So does that mean you don't think the risks are worth considering? Or that you're essentially optimistic about the outcome?


MegaDeth6666

A true singularity wouldn't be a risk, no. Anything less than that is just a fancy program, a tool, so its human-made design limitations pose the risk, as always, like all the tools before it.


pigeon888

Can you point me to that definition of true singularity? Is that the Kurzweil perspective?


MegaDeth6666

Sorry, no. Just me, so there is no citation.


pigeon888

Cool, thanks for clarifying.


nillouise

A true singularity just is ASI; no other technology actually has power that compares to ASI.


RelentlessExtropian

I think it's going to continue to follow the evolutionary principle of being 'just good enough'. It won't be an absolute paradise or a complete dystopia: some things will go wrong, a little more will go right, and a lot will be benign. The worst parts will super suck and the best parts will be unimaginably awesome. Everything seems to happen on a gradient or curve and nothing is a one-off. It probably always will be that way.


chowder-san

I don't care about the risks. An amazing disaster is still more interesting than boring normalcy. At least I am young enough to witness it personally. Besides, what can the singularity make worse than it already is? (cough Australia cough China *severe coughing fit ensues*) I am already depressed about things to come, so in my eyes the situation can only improve.


ihateshadylandlords

What exactly are the risks of the singularity? OP didn't even explain what the risks are, if any.


pigeon888

I sort of took for granted that it was understood. At a high level, the danger is that AI gets out of control and leads to our extinction. Edit: The Fermi paradox gives a good perspective. Waitbutwhy has written a great blog piece on it: https://waitbutwhy.com/2014/05/fermi-paradox.html


ihateshadylandlords

Meh, we can cross that bridge when AI develops. It’s still very much in the infancy stage. Otherwise we can just unplug the computer now.


pigeon888

Don't think so buddy.


Revolutionalredstone

Fearing the prospect of the singularity is like fearing the prospect of growing up.


pigeon888

Lol, not quite. Responsible grown-ups take out health insurance, car insurance and life insurance. They plan ahead.


Revolutionalredstone

I think ya got the wrong end of the stick there. I was saying that just as a child can't understand adulthood, we cannot understand society's machine-intelligence-based future. Insurance is not going to help, IMHO.


pigeon888

Ok, yeah, I get what you're saying now. But if we resign ourselves to not planning for something this important because we can't possibly understand it, then it's a bit like giving up without even trying.


Revolutionalredstone

An interesting perspective! I think it's more akin to how we all live our lives today: we know death is coming, and at some point we won't be in control of anything, but that doesn't mean the present is totally meaningless.

I love watching SpaceX etc. make advancements, though internally I do think that machines will replace us and make advancements that make SpaceX's technologies look somewhat cute at best. I honestly thought machine intelligence would have been created by now; the hardware has been there for decades, but it seems like the software problems are dragging on and on.

IMHO we are living on borrowed time. We monkeys are very lucky to live such rich and interesting lives, but I think it's clear that the machines we create are (or certainly will be) far superior to us in every way, and that without any doubt the future belongs to them. Ta