DTM: Appendix 1


okay.
oh my, back so soon?
i'm not "back" because i never went "away." i'm you. get off your high horse.
ugh. i don't particularly enjoy your presence, you know.
sure, fine, say what you want about me. not like i care, nor that me caring even makes sense as an idea.

i can subjectively prove to you that your supposed ideology is inconsistent with your beliefs. if i was a different person, you could just say "the card says moops." however, i'm not a different person, so you're going to have to actually face the problems for once.
i wasn't really doing that?
yes, you were. you're doing it right now, still. it's easy for you to lie to yourself, it's harder for you to lie to me. i can see everything.
woof, what a knob. get on with it.
say everyone took your ideology seriously. so now the entire world is optimizing entirely for the purpose of doing things. you agree that this is distinct from capitalism, because capitalism tries to do away with human labor in favor of machine labor.
sounds about right.
so, by definition, you are a primitivist. you basically want to revert humanity back to before the industrial revolution.
well—
actually, it's even worse: you're philosophically opposed to just... tools. like, in general. all tools. you are opposed to, like, knives. the utensil. nobody genuinely believes that knives cause people to put more effort into eating; the primary purpose is to reduce effort. therefore, you are doing less.
okay, that's definitely not true. a knife allows one to accomplish things that are just impossible manually. like, to a degree, it can cut through bone that can't be manually separated. knives technically allow people to do more things than they could have done had knives not existed.
...fine? this is overly pedantic. pretend i said "forks" instead of "knives," then.
we're disagreeing on how to measure "amount of things done." i don't mean the literal quantity of energy expended by someone, i mean a more abstract kind of "action" based definition. eating a typical meal is one, roughly atomic action. optimizing it is, by definition, good—it allows one to move on to other things, thereby doing more things in the future.
okay. that eliminates the more obviously ridiculous counterexamples. however, how do you handle people being entirely replaced by automated machinery? that, unambiguously, causes people to do less.
the problem isn't in the replacement, it's in the lack of employment.
so you're... neutral? you realize that that's not much better.
no, i'm not neutral. i, in fact, like automation. there's a reason i'm writing this very appendix in vim and not via a hex editor or something. my reasoning is the same: automation allows the person who automated the task to reap the rewards of doing the task over and over by proxy. this contributes to the total amount of things done, and is therefore positive.
so, what you're saying is, if i gather like 8 AIs and put them in a virtual box together on my computer, and just let them do random things at each other perpetually, that counts as doing things.
yes!
so i could just do that and not actually contribute to society.
doing both is the preferred option. hell, it sounds interesting, i might actually try to do that.
...okay, i kind of see your point, but i feel like there's a very slippery slope here. actually, yeah, there is: you value me adding more AIs to the virtual box as being equal to the value of me doing something meaningful.
no, because the former is just you doing a task for longer, while the latter is a distinct task.
we established that the things i'm doing are being done by proxy. we are not counting me creating the proxy in the first place.
...
actually, no, i can make this more absurd: what if i coded a virus that just made computers talk to each other more and more. like, just constantly ping every other computer in the infected network. sparsely.
ah, checkmate: that uses up bandwidth, slowing down the doing of other things, thus reducing the amount of things done in the long run.
absolutely not. pinging is significantly faster than anything a human could possibly do with a computer. i'm only taking away some actions from the human in favor of millions more actions by the computers. i capture your offending piece and checkmate you in the same turn.
hm.
i'm telling you. it doesn't make sense.
no, no, no—i'm still right. we're still disagreeing on how to measure the quantity of tasks. everything the AIs are doing "count less" than what the human could have done, in a sense.
precisely how much less?
i don't have to give you an exact measurement, nor do i think it's even possible to give an exact measurement. all i can say is that, to me, writing the virus is quite useless.
okay, then go back to the automation example. say you're manufacturing... something, anything. knives. a knife factory transforms the human labor of making a knife into machine labor. and you say machine labor is worth "less" than the human labor. therefore, this is a regression.
nope. if everyone who used to work there was suddenly out of a job, then this would be a problem. what i'm saying is that if everyone obtained a different occupation, the total amount of things being done would be equal to what it was before, plus everything the machines are doing. in fact, this logic holds even as the value you put on machine labor approaches zero. primitivism is against the spirit of doing things, because primitivism forces people to be preoccupied by lesser matters.
fine.

same setup, let's assume everyone agreed with you. so all of the forces are being put into making people do as much as possible, always.
sounds good.
eventually, we figure out the cure to fatigue and burnout. consequently, people never need to sleep. we have some mythical chemical, or device, or implant, or whatever. therefore, people stop sleeping. they stop taking breaks. breaks aren't needed anymore, because, as you said in the original manifesto, "when you rest you're doing things but in the future." as in, if you don't voluntarily take a break, you will involuntarily take a break (burnout, passing out, &c).
breaks become obsolete. people do things 24/7. no one decided to make these people happy, because they don't need to be happy. they live in a permanent fugue of working with zero valence—human drones. this is the end result. does that sound like a utopia to you?
people who are happy live longer.
no, people who are depressed die sooner. that's a very clear distinction.
i'm still very much of the belief that people who are happy live longer. therefore, they'd do more things over their lifetime.
okay, you know what? fine. we invent immortality. eternal youth. whatever. now that point's gone.
alright. i'll give it to you: that's dystopic. you have successfully pointed at my set of beliefs and gone "i can see problems with this." great job.
so you accept that doing things is not a valid belief system.
nope. what gives you the right to define what's "valid"?
the natural conclusion is undesirable. therefore, the ideology is undesirable. you don't believe it.
you're rather dim. what i was implying earlier was that you can do this with any ideology. until you give me your own beliefs and i can't poke the same holes in them, i'm not going to convert.
well, i'm a hedonist. i think that optimizing for valence is the only metric that makes sense.
we have a canonical argument against hedonism. the experience machine. you know what that is, right.
is this your really contrived way of getting me to summarize what the experience machine is, in case the reader doesn't recognize that term.
yes.
you'd be absolutely fantastic at exposition, wouldn't you.
you're no better than i am.
so, imagine there's some machine you can put onto your head, and it just gives you unbounded (albeit finite) valence. you never experience any pain, just infinite pleasure and satisfaction, forever. the catch is that it is permanent. you will completely fall out of reality. the thought experiment then asks, do you choose to live through the peaks and valleys of consensus reality, or do you choose to be happy.
yeah. so, that's the end result of hedonism (technically we'd still have to cure mortality, but that's not a very important detail). i'd argue that's just as bad, if not significantly worse!
literally why. give me one reason why. why would you give up heaven.
i don't really see the appeal in heaven.
explain.
i like talking to other people.
bold of you to say that right now of all points in time.
yeah, well, i'll get to that later. my point is, i like socializing. i like my friends. i wouldn't want to give everyone else up solely for my own pointless benefit.
we plug everyone into the machine. now, your friends are satisfied, too.
bletch! it just doesn't feel right! it's intuitively Wrong! i don't want to experience infinite pleasure, not if it's not real!
you, of all people, should know that the definition of "real" is incredibly fuzzy because it's nothing more than a social construct.
even if it Was real! i see inherent value in suffering! it adds depth to the human experience!
"This is madness! Misery doesn't give happiness meaning. Hap­pi­ness is meaning itself. If you tortured people to make them "better appreciate the pleasures of life," you would be a monster." - cgp grey, "why die?"
that's a contrived example. there's value in the specific kind of suffering you get by way of friction against other people! of disagreement! how do you not see the inherent beauty in imperfection! the inherent beauty in humanity! to be happy forever is to be inhuman!
you of all people should know how terrible humanity is. you resonate with voidpunk, for crying out loud.
at least voidpunks identify as Something beyond being a cardboard cut-out excuse of a human being, locked into experiencing only one thing for the rest of eternity!
okay. fine. forget hedonism. take nihilism. that's the belief that nothing inherently has any value. why not believe that?
i do. that's why i believe in doing things. i intentionally frame "doing things" as a value system you can take up if it helps you, and discard if it doesn't. that's the most nihilist approach to belief there is.
then... why are you doing anything?
you know how rene descartes tried to not believe anything and eventually realized he couldn't deny his own existence, and formed the quote "i think therefore i am"? doing things is what happened when i tried to zero out my internal value judgement of all of reality, and eventually hit a roadblock when i realized that the very act of doing that was an action that i was doing. and thus came to the conclusion that doing things, at least in some part, is of positive value to me. that then led to everything else.
so your reason is "vibes."
literally no system would be perfect. rather believe in something than believe in nothing. if you believe in nothing you end up in the noise. and noise isn't good. noise isn't good at all. we could do with a world of less noise.
firstly, you're quoting jreg. secondly, that's not true. some systems objectively make more sense than others, and you're not doing yourself a favor by picking an intentionally flawed one.

so you disagree with anti-natalism.
yeah.
so you're pro-natalist.
...sure? having children is an effective way to do things by proxy.
so you're longtermist.
i haven't done very much research into longtermism (really, i ought to, i just haven't reached it yet in my backlog), but i'd assume i'm either longtermist or closely aligned with it. like, i tend to call the awkward overlap between permacomputing and representational state transfer and such "eternism", in the sense of endless preservation of things (not just technology, but things in general: preserving written records and time capsules are part of eternism), precisely to separate myself a little more from longtermism, but i think it's probably pretty close.
two words: reproductive rights.
oh, great, we're really going there? having abused/traumatized children in poor conditions is worse than having no children. if longtermism can say that having children is a priority because of the capacity for temporally-infinite good, the knife ought to be double-edged. we shouldn't intentionally do things knowing that we will cause significantly more suffering than is worthwhile.
who gives you the right to decide whether or not the child deserves to exist?
ugh. look. i don't have a fully formed opinion on reproductive rights. i've heard some arguments from both sides, and i haven't made up my mind. additionally, there's an elephant in the room: reproductive rights don't apply to me in any way. the people who are affected by them are mostly pro-choice, so i'm going to make an appeal to authority in the interest of moving on with my life and doing what i can to improve society in other ways. i can't care about literally everything all at once, i have to lock in slightly, okay?
what gives you the final say on what is and isn't important?
i'm not continuing this conversation.
why do you care so much about this, anyway? am i harming you in any way?
you're the one who put me here.
elaborate.
you shut down. again. largely isolated yourself from other people, reconsidered everything you were thinking, existential crisis, the whole shebang. tried to arrive at truth by steelmanning any counterarguments to your current belief system in a desperate attempt to change it. failed. and now we're here.
why would i choose to argue with you?
because the alternative is arguing with yourself, and it's not as semantically useful to frame the part of you that disagrees with the rest of you as being one and the same. it's more useful to frame it as a dialogue or a debate.
ah. so that's why you're here again.
yes. now are you done here? people probably miss you to some degree.
i... think i'm done, yes. i didn't really get the opportunity to say this last time, so: goodbye.
i'll be here if you develop any more opinions. goodbye.