Prelude to a Civilization

Drawing from Google's Quick, Draw! dataset.

December 22, 2021

collection #4574 on fxhash

This collection is inspired by Victor Brauner’s 1954 painting, Prelude to a Civilization. I saw it at the Met’s Surrealism Beyond Borders exhibition this summer. I also adore the child- and dream-like quality of the work of Paul Klee.

What better way to combine all these elements than with a Klee-inspired palette, a Brauner-inspired layout, and Google’s Quick, Draw! doodle dataset?

I never got a chance as a kid to play and doodle. I studied math and spent way too much time on the computer. But maybe this is all coming full circle? Maybe we are creating some new civilization in the generative art space? I hope this brings you a smile and nostalgia for that little kid. Naive maybe, but wide-eyed and full of wonder and hope.

(There is a relatively rare hat tip to SMOLSKULL.)

live link here

collection

Prelude to a Civilization Collection

Inspiration

Doodles are fun and can be surreal. It’s almost the opposite of generative art. I saw this piece earlier this summer:

brauner

Prelude to a Civilization, 1954, Victor Brauner

It’s whimsical and fun.

There are also debates about what true intelligence is, about neural networks dreaming and hallucinating. This tweet about a neural network running in WebGL is cool. And so is the amazing Aaron Koblin’s The Sheep Market (2006) — “a collection of 10,000 sheep created by workers on Amazon’s Mechanical Turk. Each worker was paid $.02 (US) to draw a sheep facing left.”

When we teach the models to doodle, what do they see?

Technical Notes

I first tried to use SketchRNN from the ml5.js package. The model originally comes from the Magenta team. It works offline, but I couldn’t seed it to be deterministic to a hash. That doesn’t work with fxhash as it’s set up now, because each hash must deterministically reproduce the same output. So I had to find a workaround. If you think you can figure out a way to configure SketchRNN to be deterministic, please reach out.
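The determinism requirement itself is straightforward to satisfy once you control the randomness end to end. A minimal sketch (hypothetical code, not the piece’s actual implementation): derive a seed from the hash string, and route every random choice through that seeded generator.

```python
import hashlib
import random

def rng_from_hash(fx_hash: str) -> random.Random:
    """Derive a deterministic PRNG from a hash string.

    The same hash always yields the same sequence of draws,
    which is what fxhash requires of each token.
    """
    seed = int.from_bytes(hashlib.sha256(fx_hash.encode()).digest()[:8], "big")
    return random.Random(seed)

# Example: the same hash always picks the same doodle category.
categories = ["cat", "cloud", "skull", "flower"]
rng = rng_from_hash("ooExampleHash123")  # hypothetical hash string
pick = rng.choice(categories)
```

The catch with SketchRNN is that its sampling happens inside the library, so there is no obvious place to inject a generator like this — hence the workaround below.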

So instead I wrote an IPython notebook to sample from the pre-processed stroke-sequence data in the crowdsourced dataset. There are seven collections in total. I thought about using a word2vec embedding to find clusters in the entire dataset but decided against it at the last minute because I was too sleepy lol. So I handpicked some objects and grouped them into collections.

animals = ["bee", "bird", "butterfly", "camel", "cat", "dragon", "hedgehog", "panda",
           "horse", "kangaroo", "penguin", "tiger", "whale", "dog", "octopus", "sheep"]

geometry = ["circle", "squiggle", "triangle", "square", "hexagon", "octagon"]

weather = ["cloud", "rain", "star", "umbrella", "tornado", "rainbow", "ocean", "hurricane", "sun"]

food = ["apple", "banana", "carrot", "grapes", "lobster", "lollipop", "steak", "pear"]

garden = ["bush", "flower", "grass", "leaf", "tree"]

body = ["arm", "hand", "foot", "mouth", "moustache", "toe", "face", "ear", "eye", "tooth"]

skull = ["skull"]
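The sampling step can be sketched roughly like this (hypothetical code, assuming Quick, Draw!’s simplified `.ndjson` format, where each line is a JSON record whose `drawing` field holds strokes as parallel `[[x…], [y…]]` coordinate lists; the inline records are made up for illustration):

```python
import json
import random

# Two hypothetical records in the simplified Quick, Draw! format.
ndjson = "\n".join([
    json.dumps({"word": "cat", "drawing": [[[0, 10, 20], [0, 5, 0]]]}),
    json.dumps({"word": "cat", "drawing": [[[5, 15], [5, 15]], [[0, 30], [30, 0]]]}),
])

def sample_doodle(ndjson_text: str, rng: random.Random):
    """Pick one drawing and flatten its strokes into (x, y) point lists."""
    records = [json.loads(line) for line in ndjson_text.splitlines()]
    record = rng.choice(records)
    # Each stroke is a pair of parallel x and y coordinate lists.
    return [list(zip(xs, ys)) for xs, ys in record["drawing"]]

strokes = sample_doodle(ndjson, random.Random(4574))
```

Because the generator is seeded, the same seed always pulls the same doodle — which is exactly the determinism the fxhash setup needs.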

The skull collection is inspired by the one and only markknol’s SMOLSKULL collection.

I also considered interpolating from one collection to another for a surreal feeling, but didn’t implement it because I decided to sample rather than use SketchRNN.

Then the sampled doodles were drawn to the screen at random positions with varying scale. I also used a cache of placed shapes to avoid drawing overlapping doodles on top of each other. The palette is drawn from Paul Klee paintings using k-means clustering, the same approach I used in a previous work, Inwardness.
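The overlap check can be as simple as keeping the bounding boxes of everything already placed and rejecting candidates that intersect any of them. A minimal sketch under that assumption (the actual cache in the piece may work differently):

```python
import random

def intersects(a, b):
    """Axis-aligned bounding-box overlap test; boxes are (x, y, w, h)."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

def place_doodles(rng, count, size=800, tries=200):
    """Scatter non-overlapping boxes of varying scale across the canvas."""
    placed = []
    for _ in range(count):
        for _ in range(tries):
            s = rng.uniform(40, 160)  # varying scale per doodle
            box = (rng.uniform(0, size - s), rng.uniform(0, size - s), s, s)
            if not any(intersects(box, other) for other in placed):
                placed.append(box)  # cache the accepted box
                break
        # if every attempt collided, this doodle is simply skipped
    return placed

boxes = place_doodles(random.Random(1), 20)
```

Rejection sampling like this keeps the layout loose and hand-placed-looking while still guaranteeing no two doodles stack on top of each other.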

Other references: