
What is an "agent"?


in reply to Ben Weinstein-Raun

Okay, so if these complex systems like weather and biology are theoretically best described by some ideal set of policies, then would these complex systems, even the stable, self-replicating ones, be considered non-agentic?

I'm struggling to see the fundamental difference between a fire that hungrily eats all the wood in the pile, and me, a person who hungrily eats all the snacks in the pantry. Unless we're considering some ineffable free will, I mostly see the difference as my systems being much more complex and illegible, making it hard to map out the full causal chain from my biochemistry and psychology to my hands reaching for a bag of chips.

Combustion is much simpler, but from some all-knowing perspective, they're both self-sustaining chain reactions of chemistry.

in reply to Sam FM

I think fire is actually fairly agentic; e.g. modeling fire the way you suggest will work better than modeling "will it rain tomorrow" as an agentic process, in the way pre-scientific humans seem to have. Rain dances and goat sacrifices don't work, but giving the fire more wood does work.
in reply to Ben Weinstein-Raun

I do think the fire is also much, much less agentic than the person eating snacks, and not primarily because of the complexity difference, but because it's much easier to understand the fire as a simple policy (i.e. describing a simple rule for exactly what it will do given a set of conditions) or as a system made of smaller parts (i.e. describing the chemical process). In some sense this is related to low complexity, but I think there are things that are easy to describe this way while still being very complicated, like a heavily obfuscated computer program that does something simple.
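The "simple policy" framing above can be made concrete with a toy sketch. This is purely illustrative: the function name, inputs, and thresholds are invented for this example, not taken from the thread.

```python
# Toy illustration: a fire modeled as a simple stimulus/response policy.
# All names and conditions here are invented for illustration.

def fire_policy(fuel: float, has_oxygen: bool) -> str:
    """A simple rule mapping conditions exactly to behavior.

    Because the whole mapping fits in a few lines, there's no need to
    attribute goals or 'wants' to the fire to predict what it will do.
    """
    if fuel > 0 and has_oxygen:
        return "burn"      # consume fuel while conditions allow
    return "die out"       # no fuel or no oxygen: the reaction stops

print(fire_policy(fuel=5.0, has_oxygen=True))   # burn
print(fire_policy(fuel=0.0, has_oxygen=True))   # die out
```

The point of the sketch: when a system's full condition-to-behavior rule is this short and exact, the agentic frame adds nothing. A person reaching for snacks resists any comparably compact policy description, which is the asymmetry being pointed at.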
in reply to Ben Weinstein-Raun



How do tools differ from trading partners?


Is it that you model trading partners primarily as agents, and tools primarily as stimulus/response rules?
in reply to Ben Weinstein-Raun

One relevant difference is that trading partners might optimise against you, while tools generally don't.


Are humans more powerful than rats?


in reply to Ben Weinstein-Raun

Reminds me of the "you are bugs" scene in The Three-Body Problem.

> And as we go about changing the world to suit our preferences, the rats will remain unconsulted. It seems clear to me that rats will only get what they want, when what they want happens to be nearly-costless to humans.

This seems like it's making progress towards a formalization, though I think it still struggles.

If you imagine that covid virions were agents, then it seems to me that although there's a sense in which we're much more powerful than them (humanity could, if "it" wanted, defeat them), they can kinda get what they want without enormous costs to humans. And yet humans are still much more powerful than covid virions.

in reply to JP Addison

I'm not sure I understand the last paragraph; my guess is you're saying that covid virions are imposing large costs on humans to get what they want, and yet seem less powerful than humans; is that right?