this post was submitted on 19 Feb 2024
98 points (85.0% liked)

Asklemmy


A loosely moderated place to ask open-ended questions


Assuming our simulation is not designed to auto-scale (and our Admins don’t know how to download more RAM), what kind of side effects could we see in the world if the underlying system hosting our simulation began running out of resources?

[–] [email protected] 1 points 8 months ago* (last edited 8 months ago) (1 children)

What I'm saying is there would be no need to go that far at all times. It's the equivalent of a game only rendering distant objects when you look through a scope. Why render everything at all times if it isn't being used and does not affect the experience? It would increase the overhead by an insane amount for little to no gain.
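The game analogy is basically level-of-detail (LOD) selection: simulate in full fidelity only what an observer could actually notice. A toy sketch of the idea, where the class, thresholds, and tier names are all made up for illustration:

```python
from dataclasses import dataclass

@dataclass
class WorldObject:
    name: str
    distance: float      # distance from the observer
    being_scoped: bool   # is the observer zooming in on it?

def detail_level(obj: WorldObject) -> str:
    """Pick a simulation fidelity: full detail only when it could be perceived."""
    if obj.being_scoped or obj.distance < 10.0:
        return "full"          # simulate every detail
    if obj.distance < 100.0:
        return "approximate"   # cheap stand-in model
    return "statistical"       # aggregate behaviour only

mountain = WorldObject("distant mountain", distance=5000.0, being_scoped=False)
print(detail_level(mountain))  # statistical
```

Real engines do the same trade-off with mesh and texture detail; the hypothetical simulation would just extend it to physics itself.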

This is also just a thought exercise.

[–] [email protected] 1 points 8 months ago* (last edited 8 months ago) (1 children)

> Why render everything at all times if it isn't being used and does not affect the experience?

But how does the simulation software know when it needs to calculate that detail? If you are the only person in the simulation, it's obvious, because everything is rendered from your perspective. But if there's more than one person in the universe, an AI program has to look at the state of mind of everyone in the universe to make sure nobody is doing something where they could perceive the difference.

Am I microwaving a glass of water to make tea, or am I curious about that YouTube video where I saw that you can use a microwave to measure the speed of light? Did I just get distracted and not follow through with the measurement? Only something constantly monitoring my thoughts can know, and it would have to do that for everyone, everywhere, in the entire universe.
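For context, the microwave trick works because standing waves melt food (or heat water) at hot spots spaced half a wavelength apart, so c = frequency × wavelength. A quick sketch, where the 2.45 GHz frequency and ~6.1 cm spot spacing are typical values rather than anything from this thread:

```python
# Estimate the speed of light from a microwave oven measurement.
# Standing waves inside the oven create hot spots half a wavelength apart.

frequency_hz = 2.45e9        # typical consumer microwave magnetron frequency
hotspot_spacing_m = 0.061    # measured distance between hot spots (~6.1 cm)

wavelength_m = 2 * hotspot_spacing_m   # spots sit half a wavelength apart
c_estimate = frequency_hz * wavelength_m

print(f"estimated c = {c_estimate:.3e} m/s")  # ~2.99e8 m/s
```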

[–] [email protected] 1 points 8 months ago (1 children)

The way I see it, it would be coupled to the tool, not to the intention someone has with it. So every microwave would be rendered properly at all times, as would most electronics just by their very nature, regardless of what the person plans to do with them.

Actually, I think they could probably just approximate the microwave stuff and keep fully rendering precise electrical tools like oscilloscopes.

They would only need to fully render things that give an exact measurement; the microwave trick has about a 3% tolerance, which is huge in the scope of things.
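A rough way to see why that tolerance makes the measurement forgiving: even a spacing reading that's off by a millimetre or two still lands within a few percent of the true value. A sketch, with the 3% figure taken from the comment above and the sample measurement made up:

```python
# Check whether a sloppy microwave estimate still falls inside a 3% tolerance.
C_TRUE = 299_792_458.0  # defined speed of light, m/s

def within_tolerance(c_measured: float, tolerance: float = 0.03) -> bool:
    """True if the measurement is within the given relative tolerance of c."""
    return abs(c_measured - C_TRUE) / C_TRUE <= tolerance

# A hot-spot spacing read as 6.0 cm instead of ~6.1 cm still passes:
print(within_tolerance(2.45e9 * 2 * 0.060))  # True (about 2% low)
```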

It seems like a lot, but it's less than simulating every single atom, imo.

[–] [email protected] 1 points 8 months ago

It's more than electronics. Every piece of diffraction grating could be used to make a wave-interference measurement. Every fiber-optic line in the world, too, because if you bend it too much the light no longer stays bound inside.
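The fiber point comes down to total internal reflection: light stays guided only while it strikes the core/cladding boundary above the critical angle, and a tight bend pushes rays below it. A sketch using typical (assumed, not measured) refractive indices:

```python
import math

# Total internal reflection holds while light hits the core/cladding
# boundary at an angle above the critical angle theta_c = asin(n2 / n1).
n_core = 1.48      # typical fiber core refractive index (assumed)
n_cladding = 1.46  # typical cladding refractive index (assumed)

theta_c = math.degrees(math.asin(n_cladding / n_core))
print(f"critical angle ≈ {theta_c:.1f}°")  # ~80.6°
```

Bend the fiber sharply enough and rays meet the boundary below that angle, so light leaks out, which is exactly why a bent fiber becomes a crude measuring device.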

But that still doesn't get rid of the AI part, because you need something watching to know when an electronic device is created by anyone, anywhere in the universe, and to understand that it's the kind of device that could reveal detailed measurements.