AI Case Study

Lion spews out poetry in Trafalgar Square using deep learning for London Design Festival

For London Design Festival 2018, Luke Halls Studio, in collaboration with LUX Technical, Stage One, creative technologist Ross Goodwin and the Lab at Google Arts & Culture, created a 1:1 replica of the lion sculptures in Trafalgar Square. This fifth lion, which joined the famous pride painted fluorescent red, invites the crowd to "feed" it with words. From each word, the lion generates a two-line poem using a long short-term memory (LSTM) recurrent neural network (RNN), which has learnt to predict the next text character from 25 million words of 19th century poetry.

Industry

Consumer Goods And Services

Media And Publishing

Project Overview

"For London Design Festival 2018, a fifth fluorescent red lion joins the famous pride in Trafalgar Square. The new lion will not be silent: it will ROAR.

Everyone is invited to ‘feed the lion’, and this lion only eats words.

Each word the lion digests is the starting point from which a two-line poem is generated, by an algorithm trained on 25 million words of 19th century poetry, the century the lions came to be.

By daylight, the ever-evolving collective poem will light up the mouth of the lion accompanied by a choral roar. By night, the poem will be projection-mapped over the lion and onto Nelson’s Column itself: a beacon of streaming text inviting others to join in and add their voice.

The poetry is generated by an artificial neural network model that has learnt to write by reading millions of words of 19th century poetry.

Running the algorithm requires a network of hardware and software that relays each submitted word to the model and then sends the generated poem to the audio and visual outputs. This allows the artwork to respond in real time to the ever-evolving poem.
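The plumbing described above can be sketched as a simple producer/consumer pipeline: submitted words enter a queue, a worker passes each one to the poem generator, and the result fans out to the audio and visual outputs. This is a minimal illustrative sketch, not the studio's actual system; the function and handler names are hypothetical, and the generator is a placeholder standing in for the trained model.

```python
import queue
import threading

word_queue = queue.Queue()  # words submitted by visitors
outputs = []                # collects what each output channel received

def generate_poem(word):
    # Placeholder for the LSTM model: returns a stub two-line poem.
    return f"{word} in the evening light,\na voice that roars into the night"

def audio_out(poem):
    outputs.append(("audio", poem))     # e.g. trigger the choral roar

def visual_out(poem):
    outputs.append(("projection", poem))  # e.g. projection-map the text

def worker():
    while True:
        word = word_queue.get()
        if word is None:  # sentinel value signals shutdown
            break
        poem = generate_poem(word)
        for send in (audio_out, visual_out):
            send(poem)

t = threading.Thread(target=worker)
t.start()
word_queue.put("lion")
word_queue.put(None)
t.join()
```

Decoupling input from output through a queue is what lets the installation keep accepting words while earlier poems are still being rendered.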

To build this, Google Arts & Culture worked with collaborator LUX Technical and installed the system inside the lion.

The build of the replica lion at 1:1 scale was tasked to Stage One. They scanned one of the lions in Trafalgar Square using LiDAR; the form was then cut from sixteen 35 kg foam blocks and pieced together."

Reported Results

Please Feed The Lions installation at Trafalgar Square as part of the London Design Festival 2018

Technology

"To generate the poem, a deep learning algorithm is used, which was created by creative technologist Ross Goodwin and the Lab at Google Arts & Culture.

The specific algorithm is known as a long short-term memory (LSTM) recurrent neural network (RNN). It works by learning to predict the next text character (letter, space, or punctuation) again and again, always taking into account the text that came before its current prediction. As it repeatedly tries to predict the next characters in its training corpus, it constantly tunes millions of internal weights, until it is able to predict accurately on sequences it has not seen before.

At this point, arbitrary sequences of text can be fed in (the words submitted by visitors), and the machine will expand these submissions into its own form of poetry."
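The mechanics above can be sketched in a few lines: an LSTM cell updates its hidden and cell state one character at a time, and generation feeds each sampled character back in, so every prediction conditions on all the text before it. This is a minimal NumPy sketch with untrained random weights (so its output is gibberish, unlike the model trained on 25 million words); all function names and dimensions here are illustrative, not the studio's actual implementation.

```python
import numpy as np

def lstm_step(x, h, c, W, b):
    """One LSTM step. x: one-hot character vector (length D); h, c: previous
    hidden and cell state (length H); W: weights of shape (4*H, D+H); b: bias."""
    H = h.shape[0]
    z = W @ np.concatenate([x, h]) + b
    i = 1 / (1 + np.exp(-z[:H]))        # input gate
    f = 1 / (1 + np.exp(-z[H:2*H]))     # forget gate
    o = 1 / (1 + np.exp(-z[2*H:3*H]))   # output gate
    g = np.tanh(z[3*H:])                # candidate cell state
    c_new = f * c + i * g               # memory carried along the sequence
    h_new = o * np.tanh(c_new)
    return h_new, c_new

def sample_text(seed, vocab, W, b, W_out, b_out, length=40, rng=None):
    """Autoregressive sampling: warm up on the seed word, then repeatedly
    predict a next-character distribution and feed the sample back in."""
    rng = rng or np.random.default_rng(0)
    H = b.shape[0] // 4
    h, c = np.zeros(H), np.zeros(H)
    idx = {ch: k for k, ch in enumerate(vocab)}
    out = list(seed)
    for ch in seed[:-1]:                # digest the submitted word
        h, c = lstm_step(np.eye(len(vocab))[idx[ch]], h, c, W, b)
    ch = seed[-1]
    for _ in range(length):
        h, c = lstm_step(np.eye(len(vocab))[idx[ch]], h, c, W, b)
        logits = W_out @ h + b_out
        p = np.exp(logits - logits.max())
        p /= p.sum()                    # softmax over the character vocabulary
        ch = vocab[rng.choice(len(vocab), p=p)]
        out.append(ch)
    return "".join(out)

# Usage with random (untrained) weights and a tiny vocabulary:
vocab = "abcdefghijklmnopqrstuvwxyz ,\n"
D, H = len(vocab), 16
rng = np.random.default_rng(1)
W = rng.normal(0, 0.1, (4 * H, D + H))
b = np.zeros(4 * H)
W_out = rng.normal(0, 0.1, (D, H))
b_out = np.zeros(D)
text = sample_text("lion", vocab, W, b, W_out, b_out)
```

Training such a model means adjusting `W`, `b`, `W_out` and `b_out` so that the softmax distribution assigns high probability to the true next character at every position in the corpus.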

Function

Operations

General Operations

Background

According to designer Es Devlin: "British design guru Sir John Sorrell nudged me as we walked through Trafalgar Square this time last year and said: 'Landseer never wanted those lions to look so passive: he proposed a much more animated stance, but Queen Victoria found it too shocking.' The thought lodged in my mind. What if we could invest the lion with a diversely crowd-sourced single collective poetic voice?"

Creating the lion required a collaborative approach, with different studios working together across sound design, fabrication, programming and projection.

Benefits

Data

Millions of words of 19th century poetry

bottom of page