This week, I debuted a series of Jacquard tapestries called the “Hug” series in the Be Mine exhibition at the Greenleaf Art Center in Chicago. The image is of a person on a background with outstretched arms, as if about to embrace someone. The concept for the series is that when one is wrapped in the blanket, that person appears to be held by the figure woven into it. This is a piece that acknowledges isolation, loneliness, homelessness, and those needing consoling. By wearing the blanket, the idea is that the person at least knows someone has thought of them.
I promised this post to Dan Royer, the maker of my Makelangelo 3, and with a few hours to kill in the Istanbul airport on the way to Transmediale in Berlin, the urge to write comes easily.
So, for a couple of months, I have been making color drawings of my cats on the M3. I’ve been using the path algorithm, but I’m sure that will be the case only for three or four series, as I have already begun layering different algorithms for their effects.
The question is: color on the M3?
Isn’t it a one-pen, single-color machine? Of course it is. But the secret to unlocking color on the M3 is to consider what other processes lay down one color at a time. For example, in graduate school, Janet Ballweg taught me an incredible method in which one can use digitally etched solar plate to approximate old-school color litho.
And the secret, of course, is that offset printing uses four colors: cyan, magenta, yellow, and black (interestingly, black is called “K” because it was the key, or registration, plate). So, the secret to color on the M3 is to separate the channels of color and lay them down with the proper color pen to approximate your image. But, as I use the path algorithm, the result is going to be stylized, which is my intent.
But how do you go about separating the channels for the robot? That’s pretty easy. Go into Photoshop, GIMP, or what have you, change the image to CMYK, then export the channels as grayscale bitmaps for the robot to use with your given color. Why not use RGB? That is the old additive/subtractive issue: we’re imaging with ink, not light. Secondly, and this is a fairly big issue, if you’re using wandering paths, you know that results may vary. Therefore, I always recommend cranking up the saturation and contrast at least 50% to accentuate the colors and details of your image. I do this on both ends, boosting the saturation before export and the contrast on each grayscale channel image after. That way, I get a lot of tonal and hue differentiation.
The rest is easy. Name each channel something like randominternetcat1cyan.jpg and so on, load it, generate the gcode, and load your pen.
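The separation step doesn’t even require Photoshop or GIMP; here is a minimal sketch of it in Python, assuming the Pillow library is available (Pillow is my choice here, not part of the M3 toolchain). Note that Pillow’s built-in RGB-to-CMYK conversion leaves the K plane empty, so this computes the key plate by hand; the filenames and the roughly 50% saturation/contrast boosts follow the workflow above, and the tiny synthetic demo image stands in for a real photo.

```python
# Sketch: split an RGB image into C, M, Y, K grayscale bitmaps for the plotter.
# Assumes Pillow is installed; filenames follow the randominternetcat1cyan.jpg pattern.
from PIL import Image, ImageEnhance

NAMES = ("cyan", "magenta", "yellow", "black")

def separate_channels(path, basename):
    img = Image.open(path).convert("RGB")
    img = ImageEnhance.Color(img).enhance(1.5)  # ~50% saturation boost before separation
    w, h = img.size
    planes = [Image.new("L", (w, h)) for _ in NAMES]
    src = img.load()
    outs = [p.load() for p in planes]
    for y in range(h):
        for x in range(w):
            r, g, b = src[x, y]
            c, m, yl = 255 - r, 255 - g, 255 - b
            k = min(c, m, yl)  # key (black) plate via grey-component replacement
            for i, v in enumerate((c - k, m - k, yl - k, k)):
                outs[i][x, y] = v
    for name, plane in zip(NAMES, planes):
        plane = ImageEnhance.Contrast(plane).enhance(1.5)  # ~50% contrast boost per channel
        plane.save(f"{basename}{name}.jpg")

# Demo: a small synthetic image stands in for a real source photo.
Image.new("RGB", (32, 32), (200, 40, 40)).save("randominternetcat1.jpg")
separate_channels("randominternetcat1.jpg", "randominternetcat1")
```

One convention to watch: in these output files, 255 means full ink. If your path generator draws dark pixels instead, invert each plane (Pillow’s `ImageOps.invert`) before saving.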
Even though the M3 depends on just a toothed belt for registration, it generally works fine. Dan tells me to take the gcode and combine all four blocks of code with a pause to change each pen, but I haven’t done that yet, as the freezes are rare. Now you know how to do color on an M3.
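Dan’s suggestion, one combined file with a pause at each pen change, can be sketched like this. Treat it as an assumption-laden sketch: `M0` as the pause command and the `.ngc` extension are my guesses here, so check what your Makelangelo firmware and toolchain actually honor before running a real job. The demo writes tiny stand-in gcode files so the sketch is self-contained.

```python
# Sketch: merge four per-channel gcode files into one job, pausing before each
# pen change. M0 and the .ngc extension are assumptions; verify against your firmware.
channels = ("cyan", "magenta", "yellow", "black")
base = "randominternetcat1"  # hypothetical basename from the channel-separation step

# Demo only: tiny stand-in gcode files, one per channel.
for name in channels:
    with open(f"{base}{name}.ngc", "w") as f:
        f.write("G0 X0 Y0\nG1 X10 Y10\n")

# Combine the four jobs, pausing before each so the pen can be swapped.
with open(f"{base}_combined.ngc", "w") as out:
    for name in channels:
        out.write(f"M0 ; pause -- load the {name} pen, then resume\n")
        with open(f"{base}{name}.ngc") as f:
            out.write(f.read())
```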
But a couple notes in passing. If you notice, my Random Internet Cats are very colorful, and if you find one of my source images online, you’ll see I’m not using a realistic color palette at all. That would be boring, right?
Also, why use only CMYK colors for the CMYK channels? I’m starting to use fluorescent Sharpies for variety, but I still like to use a black channel to ground the image a little. Lastly, why use the same algorithm on each channel? I’ve had really mixed results going that far, but it’s been interesting to see what comes out. For now I mainly tend to like layering algorithms monochromatically. Pulse line with a wandering path is pretty cool.
So go nuts. This is art, right? Why stick to reality? I rarely do.
All right. So much has happened since the last post that I will break it into chunks. Right now I am sitting in O’Hare Terminal 5 waiting to go to Berlin for my first solo show in Europe (I’ve done group shows, and collective shows with RTMark, The Yes Men, and Second Front).
However, this is the first one with just me. Wolf Lieser has been generous enough to take me on as part of the DAM stable and grant me a show during Transmediale, one of Europe’s premier media art festivals.
What will I be showing? Tapestries, of course, and I will see if I can get them wired for AR. I will be showing tapestries from Allahyari/Unluata’s collective project I was part of, Your Day/My Night, from 8 Bits or Less, and some work from Die Wunderkammer. Also showing will be my engraving work based on my late-90’s figurative work, along with many videos from Second Front, The Yes Men, 8 Bits or Less, and Suburban Meditation.
Earlier on Thursday, I will be showing my seminal work from my resurrection of Slow Scan Television, Slow Scan Subterranean Homesick Blues. Slow Scan Television is a HAM radio technology, used during the Apollo program, that converts a frame of video to audio and transmits that frame every seven seconds at very low resolution. Since one of my specialties is lo-fi, SSTV is perfect. To make it even more lo-fi, I hid under a tapestry and videoed the piece with a Flip Mino, referencing early video pioneers like Larry Cuba, who shot film from a monitor.
The piece is a re-envisioning of the Bob Dylan piece, re-edited so that every cue card gets seven seconds, long enough for a frame to scan. That stretched it to about nine seconds, so I used Paul’s Time Stretcher to resample the audio, which gives it a great, ethereal quality. The piece runs about nine minutes in total.
So, for now, “Artifacts” at DAM (a double play on low resolution and the fact that the show is largely made up of artifacts of my exploration in media culture and fabrication), and SSSHSB earlier in the day at Transmediale. I’m very excited, as I’ll see many friends there.
It has been brought to my attention that the artist involved, T. S. Bathurst, AKA Sysperia Poppy, read a possible link between her work and the 2007 ageplay ban on page 4 of my essay, The Translation of Virtual Art, specifically the version in Leonardo Electronic Journal, Vol. 16, Issue 4-5. In re-reading the essay, I could see how this misperception was possible, since my interpretation of her work, dealing with innocent depictions of certain forms of avatars, sits next to the previous paragraph discussing that controversy. Secondly, I have to excuse myself in that, in my research, I had unknowingly relied on erroneous sources, which led me to believe that others such as Zoe Harnell were collaborators rather than Bathurst being the sole creator. This occasionally happens, this time to my chagrin.
My intention was to champion Bathurst as a courageous artist in a virtual world and one worthy of mention in my scholarly work.
I regret that my reference to improperly referenced primary sources (web pages now long gone) led to this misunderstanding, and I extend my profound apologies to Ms. Bathurst, her family and colleagues. In the future, I will endeavor to offer a version that states corrected information.
Thank you for your attention and forbearance.
I knew about Scott’s obsession with Marcel Duchamp’s obsession with chess (Duchamp appeared to just go off and play chess for the twenty years he was making The Illuminating Gas). The interesting thing is that he had a custom-made chess set that was lost to history. Here is the original set.
However, my friends Bryan Cera (Milwaukee) and Scott Kildall (San Francisco) reconstructed the set as a set of objects for download on the DIY 3D printer site, Thingiverse. This project was part of Scott’s residency at the Autodesk labs, and Tom Burtonwood, Pete Prodoehl, Patrick Lichty, Frankie Flood and Michael Ang printed them out in a variety of materials, with Frankie recasting them in BRONZE (I wanted to, but he had better access to the foundry).
Another interesting thing is that on a Replicator 2 with PLA, the 89-degree cuts did not “spaghetti/overshoot” too badly. Thanks, guys, for the opportunity to be part of this great project. Here is my red set; hope you like it.
Here’s the link to the set, in case you want to print it.
A “darknet” is a network that is accessible by a web-capable device but not connected to the larger Internet as a whole. What is shown below is my extension of the Occupy.here project as a mobile wi-fi movie server. I got about a 500-foot radius out of it, and am checking on the battery life. A really exciting little project.
This week, Michele Preysler and I finished our first collaboration, a variation on Becky Stern’s Gemma-based earrings from the Adafruit.com site. I modified the programming slightly to generate different patterns, and Michele did the metal and floral work. Next thoughts are a possible fiber optic shawl or large neckpiece using the Adafruit large-scale Neopixel LED rings.