[spam][crazy][fiction][random] Non-Canon MCBoss Spinoffs

Undescribed Horrific Abuse, One Victim & Survivor of Many gmkarl at gmail.com
Sun Jan 21 05:56:44 PST 2024


I drafted a simple translation of general complex inference into machine
learning terms (introspectively):

3 models (or prompts):
1. a generative or simulation model for a context
2. a classifier identifying the context
3. a trainee that generates missing system representations

Models 1 and 2 perform forward inference on the output of model 3 to form a
loss function that compares against known data, and bam: you have a system
that can depict plausible underlying causes in whatever mode you train it
on. It's cool because it's similar to human thought. Model 3 likely ends up
learning a decision tree, since multiple outputs are compared. (guessing)
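The loop above can be sketched as a toy: a minimal, hedged stand-in where the "models" are small arithmetic functions rather than neural networks, and the trainee is trained by hill climbing instead of gradient descent. Every name here (simulator, classifier, train_trainee, the 0..10 domain) is illustrative, not from any library or from the original post.

```python
import random

TRUE_CAUSE = 7  # the hidden underlying cause we hope the trainee recovers

def simulator(cause):
    """Model 1 stand-in: generative/forward model mapping cause -> observation."""
    return 3 * cause + 1

def classifier(cause):
    """Model 2 stand-in: scores whether a candidate cause fits the context (0..10)."""
    return 0.0 if 0 <= cause <= 10 else 10.0  # penalty for out-of-context causes

observed = simulator(TRUE_CAUSE)  # the known data the loss compares against

def loss(cause):
    """Forward inference through models 1 and 2 yields a trainable loss."""
    return abs(simulator(cause) - observed) + classifier(cause)

def train_trainee(steps=200, seed=0):
    """Model 3 stand-in: the trainee proposing causes, trained by hill climbing."""
    rng = random.Random(seed)
    best = rng.uniform(0, 20)
    for _ in range(steps):
        candidate = best + rng.uniform(-1, 1)
        if loss(candidate) < loss(best):
            best = candidate
    return best

recovered = train_trainee()
```

With a real system, the simulator and classifier would be frozen language models (or prompts) and the trainee would be tuned by backpropagating through them; the structure of the loss is the same.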

For example, you could use this to predict the source code for a binary
without looking inside it (easier with an RL environment, or just user
handholding):
1. prompt a language model to execute the candidate source code
2. prompt a language model to verify it is correct code and matches any
other known data
3. soft-tune a prompt to produce the source
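A runnable toy version of those three steps, with heavy hedging: the "binary" is just a table of input/output pairs, the "language model executor" is Python's eval, the verifier is a syntax check via compile, and "soft prompt tuning" is replaced by a brute-force search over candidate templates. None of these stand-ins are the real technique; they only mirror its shape.

```python
def execute_source(source, x):
    """Step 1 stand-in: 'execute the candidate source code' (here via eval)."""
    return eval(source, {"x": x})

def check_source(source):
    """Step 2 stand-in: verify the candidate is well-formed code."""
    try:
        compile(source, "<candidate>", "eval")
        return True
    except SyntaxError:
        return False

# Known data: observed input/output behaviour of the opaque "binary".
binary_io = [(x, 3 * x + 2) for x in range(5)]

def candidate_loss(source):
    """Loss from forward inference: run the candidate, compare with the binary."""
    if not check_source(source):
        return float("inf")
    return sum(abs(execute_source(source, x) - y) for x, y in binary_io)

# Step 3 stand-in: search over a tiny template family instead of soft tuning.
candidates = [f"{a} * x + {b}" for a in range(5) for b in range(5)]
best = min(candidates, key=candidate_loss)
```

In the real setting, steps 1 and 2 would each be a frozen prompted model and step 3 would optimize continuous prompt embeddings against the combined loss.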

(oops)