
AI Case Study

Weirdcore creates hallucinatory visuals for a video clip using neural networks

Artist Nicky Smith, better known as Weirdcore, has long created Aphex Twin’s visuals for music videos and live shows. For Aphex Twin’s new single “T69 Collapse”, Weirdcore leveraged a neural network to produce a visually striking video clip. The technique, style transfer, uses machine learning to generate a new image by blending the content of one image with the visual style of another.

Industry

Consumer Goods And Services

Entertainment And Sports

Project Overview

"Aphex Twin and Weirdcore dropped another dose of cyberdelic visuals for the new single “T69 Collapse,” off the Collapse EP, due out September 14 on Warp Records. As with the live visuals, Weirdcore’s video explores shapes, colors, and textures through morphing, glitchy effects. But this time, the artist introduces various cityscapes and terrains into the mix, as if the viewer were experiencing a virtual reality collapsing into a black hole.

Weirdcore tells Fast Company that in order to pull off the video’s more refined visuals, he needed to use more advanced render engines, which forced him to switch over to a Windows workstation. His original task was to create something that, aesthetically speaking, lived in between Aphex Twin’s iconic “On” video, which features stop-motion animation on Cornwall, England, beaches, and Autechre’s “Gantz Graf” video, with its morphing 3D-animated machines. But he wanted to do so in a collaged, 3D-scan kind of way.

The video’s intro features lines of text being overtaken by error, some of which almost look like email messages between Aphex Twin and Weirdcore, discussing the music and video concept. Other bits of text range from code to foreign languages, all of which Weirdcore textures and overlays onto photogrammetry scans of Cornish streets and buildings, using After Effects’ UV and Position Pass.

At the 1:04 mark, the buildings and streets take on unreal virtual appearances–moving, oscillating, and flickering with various textures, shapes, and colors. Weirdcore pulled this off using Style Transfer, a technique that uses machine learning to blend two images or videos together to create a unique third (see: Google’s Deep Dream Generator).

“It’s Style Transfer techniques using Transfusion.AI over the Cornish photogrammetry collage,” says Weirdcore. “The original animation actually looked . . . low-end, but using several Style Transfer composites/layers in various ways really made a difference.”

Weirdcore also took a 3D scan of Aphex Twin’s face–warped in familiar visual fashion–and composited it into the landscape that is collapsing into a virtual black hole. Using Aphex Twin’s audio stems, MIDI files, and BPM changing data, Weirdcore was able to sync the visuals to the music, creating an audiovisual experience that really warps the mind when watching and listening."
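To make the audio-sync step above more concrete, here is a minimal sketch of one possible approach, not Weirdcore’s actual toolchain, which is not documented beyond stems, MIDI files, and BPM data. It reads note-on times from a MIDI stem with the mido library and converts them into frame numbers that a compositing tool could use as visual keyframes. The file name and frame rate are hypothetical.

```python
# Hedged sketch: derive visual keyframe positions from a MIDI stem.
# Not Weirdcore's pipeline; an illustration of syncing visuals to music events.
import mido

FPS = 25  # hypothetical frame rate of the video timeline

def midi_event_frames(midi_path, fps=FPS):
    """Return the frame numbers at which note-on events occur."""
    mid = mido.MidiFile(midi_path)
    frames, elapsed = [], 0.0
    for msg in mid:  # iterating a MidiFile yields messages with delta times in seconds
        elapsed += msg.time
        if msg.type == "note_on" and msg.velocity > 0:
            frames.append(round(elapsed * fps))
    return frames

# Hypothetical file name standing in for one of the track's MIDI stems.
keyframes = midi_event_frames("t69_collapse_lead.mid")
print(keyframes[:10])
```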

Reported Results

Music video created for the single “T69 Collapse”

Technology

"Weirdcore pulled this off using Style Transfer, a technique that uses machine learning to blend two images or videos together to create a unique third."

Function

Background

"For the last few years, musician Aphex Twin’s visuals have been created by an equally elusive and hermetic artist–Nicky Smith, aka Weirdcore. Last summer, for instance, Weirdcore created dark yet hilarious visuals for Aphex Twin’s shows at Primavera Sound and Field Day Festival, which looked like the hallucinations of some corrupted artificial intelligence."

Benefits

Data

Images and video
