this post was submitted on 25 Nov 2023
763 points (96.7% liked)

Technology

[–] [email protected] 157 points 11 months ago* (last edited 11 months ago) (3 children)

Can’t figure out how to feed and house everyone, but we have almost perfected killer robots. Cool.

[–] [email protected] 82 points 11 months ago (4 children)

Oh no, we figured it out, but killer robots are profitable while happiness is not.

[–] [email protected] 32 points 11 months ago (5 children)

I would argue happiness is profitable, but it would have to be shared amongst the people. Killer robots are profitable for a concentrated group of people.

[–] [email protected] 75 points 11 months ago (12 children)

Great, so I guess the future of terrorism will be fueled by people learning programming and figuring out how to make EMPs so they can send the murder robots back to where they came from. At this point, one of the biggest security threats to the U.S., and for that matter the entire world, is the extremely low I.Q. of everyone who is supposed to be protecting this world. But I think they do this all on purpose; I mean, the day the Pentagon created ISIS was probably their proudest day.

[–] [email protected] 29 points 11 months ago (2 children)

The real problem (and the thing that will destroy society) is boomer pride. I've said this for a long time: they're in power now, and they are terrified to admit that they don't understand technology.

So they'll make the wrong decisions, act confident, and the future will pay the tab for their cowardice, driven solely by pride/fear.

[–] [email protected] 16 points 11 months ago (3 children)

Great, so I guess the future of terrorism will be fueled by people learning programming and figuring out how to make EMPs so they can send the murder robots back to where they came from.

Eh, they could've done that without AI for like two decades now. I suppose the drones would crash-land in a rather destructive way due to the EMP, which might also fry some of the electronics, rendering the drone useless without access to replacement components.

[–] [email protected] 65 points 11 months ago (7 children)

The code name for this top secret program?

Skynet.

[–] [email protected] 73 points 11 months ago* (last edited 11 months ago)

“Sci-Fi Author: In my book I invented the
Torment Nexus as a cautionary tale

Tech Company: At long last, we have created the Torment Nexus from classic sci-fi novel Don't Create The Torment Nexus”

[–] [email protected] 63 points 11 months ago* (last edited 11 months ago) (2 children)

"Deploy the fully autonomous loitering munition drone!"

"Sir, the drone decided to blow up a kindergarten."

"Not our problem. Submit a bug report to Lockheed Martin."

[–] [email protected] 64 points 11 months ago (1 children)

"Your support ticket was marked as a duplicate and closed"

😳

[–] [email protected] 33 points 11 months ago

Goes to original ticket:

Status: WONTFIX

"This is working as intended according to specifications."

[–] [email protected] 17 points 11 months ago* (last edited 11 months ago) (1 children)

"Your military robots slaughtered that whole city! We need answers! Somebody must take responsibility!"

"Aaw, that really sucks *starts rubbing nipples* I'll submit a ticket and we'll let you know. If we don't call in 2 weeks... call again, and we can go through this over and over until you give up."

"NO! I WANT TO TALK TO YOUR SUPERVISOR NOW"

"Suuure, please hold."

[–] [email protected] 59 points 11 months ago (5 children)

“You can have ten or twenty or fifty drones all fly over the same transport, taking pictures with their cameras. And, when they decide that it’s a viable target, they send the information back to an operator in Pearl Harbor or Colorado or someplace,” Hamilton told me. The operator would then order an attack. “You can call that autonomy, because a human isn’t flying every airplane. But ultimately there will be a human pulling the trigger.” (This follows the D.O.D.’s policy on autonomous systems, which is to always have a person “in the loop.”)

https://www.businessinsider.com/us-closer-ai-drones-autonomously-decide-kill-humans-artifical-intelligence-2023-11

Yeah. Robots will never be calling the shots.

[–] [email protected] 56 points 11 months ago* (last edited 11 months ago) (14 children)

It's so much easier to say that the AI decided to bomb that kindergarten based on advanced intel than it would be if it were a human choice. You can't punish an AI for doing something wrong. An AI doesn't require a raise for doing something right, either.

[–] [email protected] 33 points 11 months ago (2 children)

That's an issue with the whole tech industry. They do something wrong, say it was AI/ML/the algorithm and get off with just a slap on the wrist.

We should all remember that every single tech we have was built by someone. And this someone and their employer should be held accountable for all this tech does.

[–] [email protected] 18 points 11 months ago

1979: A computer can never be held accountable, therefore a computer must never make a management decision.

2023: A computer can never be held accountable, therefore a computer must make all decisions that are inconvenient to take accountability for.

[–] [email protected] 56 points 11 months ago (3 children)

The future is gonna suck, so enjoy your life today while the future is still not here.

[–] [email protected] 29 points 11 months ago (1 children)

Thank god today doesn't suck at all

[–] [email protected] 43 points 11 months ago* (last edited 11 months ago) (2 children)

As an important note in this discussion, we already have weapons that autonomously decide to kill humans. Mines.

[–] [email protected] 109 points 11 months ago (17 children)

Imagine a mine that could move around, target seek, refuel, rearm, and kill hundreds of people without human intervention. Comparing an autonomous murder machine to a mine is like comparing a flintlock pistol to the fucking Gatling cannon in an A-10.

[–] [email protected] 62 points 11 months ago (1 children)

Well, an important point you both forget to mention is that mines are considered inhumane. Perhaps that means AI murder machines should also be considered inhumane, and we should just not do it, instead of allowing them like we allow landmines.

[–] [email protected] 26 points 11 months ago

This. Jesus, we're still losing limbs and clearing mines from wars that ended decades ago.

An autonomous field of those is horror movie stuff.

[–] [email protected] 28 points 11 months ago (1 children)

Imagine a mine that could move around, target seek, refuel, rearm, and kill hundreds of people without human intervention.

Pretty sure the entire DOD got a collective boner reading this.

[–] [email protected] 34 points 11 months ago (3 children)

Horizon: Zero Dawn, here we come.

[–] [email protected] 32 points 11 months ago (3 children)

Did nobody fucking play Metal Gear Solid Peace Walker???

[–] [email protected] 19 points 11 months ago (1 children)
[–] [email protected] 24 points 11 months ago (1 children)

Or just, you know, have a moral compass in general.

[–] [email protected] 29 points 11 months ago (2 children)

We are all worried about AI, but it's humans I worry about: how we will use AI, not the AI itself. I am sure that when electricity was invented, people feared it too, but it was how humans used it that was, and is, always the risk.

[–] [email protected] 26 points 11 months ago* (last edited 11 months ago) (3 children)

Remember: There is no such thing as an "evil" AI, there is such a thing as evil humans programming and manipulating the weights, conditions, and training data that the AI operates on and learns from.

[–] [email protected] 17 points 11 months ago

Evil humans also manipulated weights and programming of other humans who weren't evil before.

Very important philosophical issue you stumbled upon here.

[–] [email protected] 26 points 11 months ago (6 children)

Doesn't AI go into the landmines category, then?

[–] [email protected] 25 points 11 months ago* (last edited 11 months ago) (1 children)
[–] [email protected] 18 points 11 months ago

ACAB

All C-Suite are Bastards

[–] [email protected] 25 points 11 months ago (4 children)

any intelligent creature, artificial or not, recognizes the pentagon as the thing that needs to be stopped first

[–] [email protected] 24 points 11 months ago (3 children)

Saw a video where the military was testing a "war robot". The best strategy to avoid being killed by it was to move in an un-human-like way (e.g. crawling or rolling your way toward the robot).

Apart from that, this is the stupidest idea I have ever heard of.

[–] [email protected] 22 points 11 months ago (2 children)

Didn't Robocop teach us not to do this? I mean, wasn't that the whole point of the ED-209 robot?

[–] [email protected] 35 points 11 months ago (3 children)

Every warning in pop culture (1984, Starship Troopers, Robocop) has been misinterpreted as a framework upon which to nail the populace.

[–] [email protected] 21 points 11 months ago (1 children)

For the record, I'm not super worried about AI taking over because there's very little an AI can do to affect the real world.

Giving them guns and telling them to shoot whoever they want changes things a bit.

[–] [email protected] 20 points 11 months ago

Now that’s a title I wish I never read.

[–] [email protected] 19 points 11 months ago (2 children)

Makes me think of this great short movie Slaughterbots

[–] [email protected] 18 points 11 months ago (6 children)

As disturbing as this is, it's inevitable at this point. If one of the superpowers doesn't develop its own fully autonomous murder drones, another country will. And eventually those drones will malfunction, or some sort of bug will be present that gives them the go-ahead to indiscriminately kill everyone.

If you ask me, it's just an arms race to see who builds the murder drones first.

[–] [email protected] 18 points 11 months ago* (last edited 11 months ago) (4 children)

For everyone who’s against this, just remember that we can’t put the genie back in the bottle. Like the A-bomb, this will be a fact of life in the near future.

All one can do is adapt to it.

[–] [email protected] 17 points 11 months ago
[–] [email protected] 14 points 11 months ago (1 children)

Okay, are they actually insane?
