Algorithmic art, Roman Verostko

October 14, 2009 | Comments

In a bizarre coincidence, I spent a few minutes on Monday googling around generative art and came across Roman Verostko - then on Tuesday morning I received an email alerting me to a lecture he was giving at Lighthouse, a couple of streets away from our offices.

Roman talked about his life, growing up in the Depression and spending 18 years as a monk, his influences (Kandinsky, who saw "visual form analogous to musical form", in that it didn't need to be a representation of something else, and Gabo, particularly his Standing Wave piece currently at Tate) and his work.

Psalms in sound and image was one of his first "multimedia" pieces, using two slide projectors and a tape recorder - it was all very "sixties". Verostko then wanted to move from guiding a hand to produce art, to guiding a machine. This led to Magic Hand of Chance, which combined algorithmic drawings with generated text, all created using parameterised controls, "high school maths" and extremely carefully selected dictionaries of nouns, adjectives, etc. The generated text is very Zen-sounding nonsense, a "computer-generated mime of language".

Verostko seems to see machine-driven creativity, where a family of forms can be generated from the same code, as not dissimilar from the way "real" artistry works: the same materials and artist can lead to many different expressions. "All you need is the seed to clone a piece". He writes his own code and uses nothing commercial - expressing a vague embarrassment at one point that he'd used Photoshop for some typesetting and image manipulation in producing a book. He referred to Stonehenge as being a "generative structure", with the inputs being the position of the sun, the stones, etc., and maintains that algorithmics is deeply embedded in all art, across all cultures. Analogies were also drawn between code and biology - which is hardly new, but might have been a bit less hackneyed when he first came up with it ;) And as he said beautifully plainly, "every analogy limps".

One view I noticed he didn't share with his audience concerned the relationship between screen and printed output. A couple of questions leaned towards asking about his use of a pen plotter - why use such an archaic technology nowadays? His reply was that he was interested in the plotter as an extension of the human arm, and that the patterns which are thrown up by continuous overwriting of lines are both interesting in themselves, and impossible to replicate using inkjet or similar technology. It was interesting to see the (younger) audience assume that the role of a printer is to replicate the contents of a screen, whilst the artist didn't have much use for the screen as a canvas: his plotter-driven work had no "preview mode", the works had to be generated in full and then examined. We were assured this meant lots of BASIC, plain geometry, a disciplined approach, and plenty of bug fixing...

By far my favourite work was the Pearl Park Scriptures, a set of algorithmic drawings (which I think he called "cyberflowers") which combine plotted geometric patterns with either quotations or meaningless text rendered alongside in an alternative alphabet of glyphs.

A fantastic talk from an extremely interesting speaker, and a crowd of folks (art students I suspect) who I'd not come across in my Brighton/digital wanderings. I'll be keeping an eye out for what Lighthouse run next...

Lynx

October 13, 2009 | Comments

  • Stats from dating site OKCupid: "despite what you might’ve heard from the Obama campaign and organic cereal commercials, racism is alive and well"
  • Tom Watson on opening UK postcodes: "I like to think that the liberation of public information in re-usable formats using open standards is an agenda shared between progressives in both Labour and the Liberal Democrats. At the heart of the debate is a profound belief that citizens should have a greater say over the public services they use."
  • Does the iPhone app store actually sell that much?: "This isn't to say that the iPhone App Store is a failure. In fact, I'd argue it's been a huge success in making the iPhone significantly more valuable. But as evidence that there's a huge market out there of people willing to pay for content if it's just packaged up nicely? There's just not enough there to be convincing."
  • Letter from a postie about the current Post Office shenanigans: "The Royal Mail have been fiddling the figures. This is how it is being done."
  • More on generative music, this time making audio from the internal operations of programs;
  • Coping with change on Scrum projects - advice for testers: "With concepts such as TDD and BDD, the testing effort has shifted from a traditionally tail ended process to a front ended process and as a result quality is built in right from the start."
  • How to present whilst people are twittering: "Presenting while people are twittering is challenging. But isn’t it better to get that feedback in real-time when you can do something to retrieve the situation – than wait till you read the evaluation sheets a few days after the conference – and find that you bombed?"
  • Nokia are working on AR language translation: "you know you’re living in the freaking future when you can point your phone at a bunch of alien characters and have them magically make sense"
  • Ghost in the field, lovely visualisations of RFID fields, reminiscent of Dunne & Raby;
  • The ultimate uncluttered tube map: "the whole network distilled into three lines and twelve stops"
  • Simon Maddox has very kindly published the stats from his 0870 app for the iPhone; not sure if it's representative, but I found it quite shocking that an app which was downloaded and used to such an extent generated so little in ad revenue ($680.82 in two weeks);
  • Sketching in hardware: "We are coming from years of passiveness to the possibilities of designing and modifying pretty much everything around us"

Harvest Of The Tabs

October 08, 2009 | Comments

  • Martin Sorrell, who's probably spent more on paid pitches than anyone reading this, links the practice of unpaid pitching to oversupply in agency-land;
  • Notes from the Teenage Dragons Den panel at OverTheAir; really annoyed I missed this, as it sounded excellent;
  • Amazon have launched their mobile payments service, even including 1-click billing. It's nice to see some much-needed competition for PayForIt and Bango;
  • Symbian have open sourced their UserEmulator tool for automated testing;
  • Caterina Fake on working hard: "So often people are working hard at the wrong thing. Working on the right thing is probably more important than working hard."
  • 100 impediments a ScrumMaster might run into;
  • A phenomenal light-show projected onto the side of what looks like a stately home, at a "secret festival";
  • Thoughts on the future of the boutique software shop: "I do feel that (if it isn't happening already) UX will become the competitive differentiator for the boutiques". It's happening already;
  • MIT students take space photos for $150;
  • And along similar lines, the Project Icarus video of the curvature of the earth;
  • Certified Scrum Developer programme?
  • Hiding data, content and technology in real-world games, from Mr Thorpe - a lovely presentation, lots of gems in here;
  • Avoiding iPhone app rejection from Apple: "If you do find your app is rejected, the best advice I can give is try to remain calm";
  • Device APIs and Policy Working Group, a new W3C effort "to create client-side APIs that enable the development of Web Applications and Web Widgets that interact with device services such as Calendar, Contacts, Camera, etc."

Fresh from the presses of the W3C

October 08, 2009 | Comments

The Mobile Web Best Practices Working Group, of which I form a tiny cog, has released a couple of Last Call documents:

These are both open to public review; please do send comments through to the public-bpwg-comments@w3.org public mailing-list.

Project Bluebell

September 28, 2009 | Comments

        "All the world's a stage, and all the men and women merely players..."
        Last year we acquitted ourselves well at Over The Air, winning Best Overall Prototype for Octobastard, a many-limbed pile of technology we had loving nailed together in an all-night frenzy. Octobastard was many things, but immediately after the event I caught myself thinking that next time around we ought to try for something less obviously overengineered to the gills...

...so this year, I had strong urges for us to produce something not just clever, but beautiful. I wanted to do something huge and participative - mainly because I've been thinking a lot recently about helping people feel they're a meaningful part of something bigger. Smule do this, I think our Ghost Detector did it, Burning Man does it... it's a bit of a theme for me right now. And I read recently about David Byrne's audio installation at the RoundHouse, which filled me with wow.

So in the week before the event I was chatting to Thom and James (FPers who also attended OTA) and we got some ideas together. We work with mobile applications every day and know how tricky it can be to get a large number of people to install and run any piece of software on their phones in a short period of time, so we wanted to avoid that... which led us back to the fundamentals of radio. Could we do something along the lines of detecting radio signals (GSM, Wi-Fi or Bluetooth) and perhaps turning them into something interesting? Yes, it turns out we could. In the end we settled on Bluetooth, because every phone has it and we can access it from software we write for phones - and Project Bluebell took form.

The idea was to turn the whole audience of the event into unwitting musicians, and have them create a Son et Lumiere with their very presence.

To do this, we wrote a piece of custom J2ME software which we installed onto 4 commodity phones (cheapo Sony Ericsson devices with PAYG SIMs and a fiver of credit apiece). We hid these phones in the corners of the auditorium. This software scanned for Bluetooth devices nearby, and recorded their names, as well as whether they were mobile phones or something else (e.g. laptops, which were quite common at the event, and which we wanted to ignore).
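The scanner source isn't included in the post, but the phone-vs-laptop distinction it describes maps naturally onto the Bluetooth Class of Device, whose major device class (bits 8-12) is exactly what J2ME's DiscoveryListener hands you. A minimal sketch of that filter - a hypothetical reconstruction, not the actual Bluebell code:

```java
// Sketch of the kind of device filtering the scanners needed: the Bluetooth
// Class of Device (CoD) encodes a major device class in bits 8-12, letting a
// scanner keep phones and ignore laptops. Hypothetical reconstruction only.
public class DeviceFilter {
    // CoD major device class values, per the Bluetooth Assigned Numbers
    static final int MAJOR_COMPUTER = 0x0100; // laptops, desktops
    static final int MAJOR_PHONE    = 0x0200; // mobile phones

    /** Extract the major device class from a raw 24-bit CoD value. */
    static int majorClass(int classOfDevice) {
        return classOfDevice & 0x1F00; // mask off bits 8-12
    }

    static boolean isPhone(int classOfDevice) {
        return majorClass(classOfDevice) == MAJOR_PHONE;
    }

    public static void main(String[] args) {
        // 0x5A020C is a typical phone CoD; 0x3A010C a typical laptop CoD
        System.out.println(isPhone(0x5A020C)); // true
        System.out.println(isPhone(0x3A010C)); // false
    }
}
```

In a real J2ME scanner the raw value would come from `DeviceClass.getMajorDeviceClass()` in the `deviceDiscovered` callback of a JSR-82 inquiry.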

These four receivers then reported the phones near them to a server every 10 seconds or so. The server, a web application running inside Google App Engine, received all these reports and stored them in a database. It then exposed a list of the most recently seen phones, together with their location (which receiver had picked them up), through a simple API over HTTP.

Two pieces of software consumed data through this API: firstly, an audio processor which turned the Bluetooth names of phones into simple tunes (by analysing characters in the names and decomposing them into notes, note lengths and delays) and mixed the tunes together, using a library called JFugue.
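The post doesn't say exactly how names were decomposed, but JFugue plays tunes from plain pattern strings of the form note-octave-duration (e.g. "C5q" for a quarter-note C in octave 5), so one plausible sketch of such a mapping - an invented illustration, not the Bluebell source - is:

```java
// One plausible way to turn a Bluetooth device name into a JFugue-style
// pattern string (the actual Bluebell mapping isn't published): each
// character picks a note from the C major scale, and its position in the
// name alternates note lengths.
public class NameTune {
    static final String[] SCALE = {"C", "D", "E", "F", "G", "A", "B"};
    static final String[] LENGTHS = {"q", "i"}; // quarter note, eighth note

    static String toPattern(String deviceName) {
        StringBuilder pattern = new StringBuilder();
        int i = 0;
        for (char c : deviceName.toCharArray()) {
            if (!Character.isLetterOrDigit(c)) continue; // skip spaces etc.
            String note = SCALE[c % SCALE.length];       // char code -> scale degree
            String length = LENGTHS[i % LENGTHS.length]; // alternate durations
            if (pattern.length() > 0) pattern.append(' ');
            pattern.append(note).append('5').append(length);
            i++;
        }
        return pattern.toString();
    }

    public static void main(String[] args) {
        // The resulting string could be handed to JFugue's Player for playback
        System.out.println(toPattern("Nokia"));
    }
}
```

Because the output is just a string, several names can be mixed by emitting them as separate JFugue voices before playing.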

And secondly, a visualiser written in Processing which took the same data and used it as the input for several games of Conway's Life, each running simultaneously in slightly different shades of blue atop one another. Periodically this visualiser would refresh and dump some new cells into each game, depending on phones found since the last check - ensuring that the animation was continually running but seeded with real data coming from the room. We'd also occasionally show some of the device names we'd found, which we felt might help build more of a connection between folks on the floor and the light-show.
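The machinery behind that visualiser is standard Life plus a seeding step; a minimal sketch (the real Processing source isn't shown in the post, and the seeding rate here is invented):

```java
import java.util.Random;

// A minimal sketch of the Life machinery behind the visualiser: a standard
// Conway step, plus a seeding routine that dumps cells into the grid in
// proportion to newly seen phones. The actual Processing code isn't published.
public class Life {
    static boolean[][] step(boolean[][] grid) {
        int rows = grid.length, cols = grid[0].length;
        boolean[][] next = new boolean[rows][cols];
        for (int r = 0; r < rows; r++) {
            for (int c = 0; c < cols; c++) {
                int n = 0; // count live neighbours of (r, c)
                for (int dr = -1; dr <= 1; dr++)
                    for (int dc = -1; dc <= 1; dc++) {
                        if (dr == 0 && dc == 0) continue;
                        int rr = r + dr, cc = c + dc;
                        if (rr >= 0 && rr < rows && cc >= 0 && cc < cols && grid[rr][cc]) n++;
                    }
                // Conway's rules: a live cell with 2-3 neighbours survives;
                // a dead cell with exactly 3 neighbours is born.
                next[r][c] = grid[r][c] ? (n == 2 || n == 3) : (n == 3);
            }
        }
        return next;
    }

    /** Dump a handful of live cells per newly seen phone, keeping the animation fed. */
    static void seed(boolean[][] grid, int newPhones, Random rng) {
        for (int i = 0; i < newPhones * 5; i++)
            grid[rng.nextInt(grid.length)][rng.nextInt(grid[0].length)] = true;
    }
}
```

Running several such grids at once, each drawn in its own shade of blue and re-seeded on every API poll, gives the layered, continually-moving effect described above.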

        "...Full of strange oaths and bearded like the pard,
        Jealous in honour, sudden and quick in quarrel..."

All the above was produced between 11am on the Friday and around 4:30am Saturday morning: you can do a lot with a small number of smart caffeinated geeks, particularly if you're comfortable with their becoming vaguely hysterical in the process. Who convinced us that the glowsticks were a hotline to Google? Why did we start extracting new software development methodologies from Deal or No Deal? What exactly *did* we do to that bear?

How would we improve the product? I'd personally like to see a stronger link between individuals and output - so that as a participant I can see myself and the difference I'm making. I think it's possible to work this out right now by looking and listening carefully, but it could be made more obvious.

I was surprised at how well the music side of it worked (generating anything that sounds like music from what is effectively random and varying data was one of the tougher challenges). I'd like to see if we could vary the music further and perhaps add samples. I'd also like to set this up and run it again somewhere, to see how it works over a long period of time with a smaller, faster-moving audience.

Adam Cohen-Rose has kindly put some video of the second demonstration online, where you can see the Life and hear a little of the music. I'm going to try and get a better-quality video of the two together over the next few days; I'll post it up when I have it, perhaps with some audio for iTunes :)

We also had a team who were covering the event for the BBC get quite interested, and I understand we may get a mention in an upcoming episode of Click.

        "Last scene of all,
        That ends this strange eventful history,
        Is second childishness and mere oblivion..."

We were absolutely maxi-chuffed when Project Bluebell was awarded Best of Show by the panel of judges. Thank you :) Only a year to prepare for the next one, now...

Update: Project Bluebell was covered by BBC news; we also appear in the video clip bundled into this article.