This past week I came up with a writing prompt for Ben. The idea this time is not to replicate the way zebra finches learn to sing, but to pay homage to it. I asked Ben to think of a skill that could be taught to another person, and write instructions for that skill transfer in the second-person imperative voice (you must cut the tomatoes, turn on the fan, filter the coffee through the copper sieve). Once the base text is complete, we’d pick an arbitrary number, n, and change every nth word in the text to something new and random, much like I’ve been doing in my poem prompt. Then Ben would need to course-correct the text, but could only do so by modulating the three words that immediately precede and follow the nth words. The next iteration would be to pick a new number, x, change every xth word, and repeat the course correction. In this way, speech/words become sound analogs, and instead of error-correcting the pitch of a song the way a juvenile bird does, we’re error-correcting meaning.
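The substitution step of the prompt could be sketched mechanically. This is a minimal illustration, not part of the actual project; it assumes a simple whitespace notion of "word" and an arbitrary stand-in vocabulary for the random replacements:

```python
import random

def perturb_every_nth(text, n, vocabulary, seed=None):
    """Replace every nth word of `text` with a random word from `vocabulary`.

    The course-correction step (rewriting the three words on either side
    of each replacement) is left to the human collaborator.
    """
    rng = random.Random(seed)  # seeded for repeatability
    words = text.split()
    for i in range(n - 1, len(words), n):  # 0-indexed: positions n, 2n, 3n, ...
        words[i] = rng.choice(vocabulary)
    return " ".join(words)

# Hypothetical base text and vocabulary, purely for illustration
base = "you must cut the tomatoes turn on the fan filter the coffee"
vocab = ["copper", "sieve", "song", "branch", "pitch"]
perturbed = perturb_every_nth(base, 4, vocab, seed=1)
```

Repeating the process with a new number x is just another call on the corrected text, which is what makes the iterations start to read as if "a machine had a hand in writing them."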
My poem iterations now seem like a machine had a hand in writing them, which is interesting since we’ve also talked about machine learning and algorithmically generated texts for this project. But to break free from rule-based writing, I started jotting down notes on memory and consciousness for another project. It’s relevant at this point to mention that I have a three-month-old baby, and that I’ve been doing a lot of associative thinking about how her brain is developing. I think that every time she wakes from sleep, she’s coming to terms with her existence, which can be overwhelming to handle (if not terrifying) for a creature with poor muscle control and few life experiences. So she wakes up, begins to cry, and calms when I approach because she remembers me. I want to know what memory means for a baby, and how the adults we become are overlaid upon those early memories. I asked Ben if early memories are erased and overwritten like in computer systems, or if the physical structures of those early nerve networks “lignify” the way trees preserve the branching patterns of their past, sapling selves. It turns out neither of these metaphors works well enough to explain what might actually be going on, and so I’ve got some course-corrective notes to work with now.
The other thing we discussed was the nature of consciousness. I’ve still not wrapped my head around all this, since Ben’s given me a lot to think about regarding the lack of vocabulary or conceptual definitions that people can agree on before we decide what is or isn’t conscious. But for me this line of thought begins with open speculation on how much (or little) it takes to create a complete neural network (with input and output structures) in a petri dish. I fell into a research wormhole on brain organoids (pea-sized, self-organized, differentiated neuron bundles made from scratch out of human stem cells) and the ethics of creating and using them. More on that at a later date.