Interface Fractures IF4Q

Interface Fractures is a series of audiovisual explorations Luka Prinčič has been developing in collaboration with the Slovenian Cinematheque since 2013. The series’ cinema-sound episodes share the same method and format: an immersive situation in the darkness of the cinema, the use of digital and open source tools for generating image and sound, a dichotomy between fixed composition and improvisation in time, a tendency for abstraction out of which fragments of the concrete arise, and playing with the synchronisation of sound and image. Although the creation process includes an examination of the contemporary human condition, conventional narration is not so important in the created performances and remains in the background. Interface Fractures is thus a fragmented excursion into abstract sound and the moving picture. Their synthetic integration emerges from the process of searching for fractures and subjectivities in a seeming impenetrability of polished and polarised interfaces – mediated, inter-machinic, interhuman.

Collaborators

Author: Luka Prinčič

Support

Production: Emanat – (emanat.si/en/production/luka-princic–interface-fractures-if4q)
Co-production: Slovenska kinoteka
Financial support: City Municipality Ljubljana, Ministry of Culture RS

Technical

Technical rider -> https://goo.gl/VAIHtq

Events

, MENT, Pritličje, Ljubljana, SI
, Cirkulacija, Ljubljana, SI
, Slovenska kinoteka, Ljubljana, SI (cancelled)

Video


Processing and Awesome WM

Processing is a Java application and has had trouble in my Awesome setup for a while now, in various ways. With some early versions (2.x) the main IDE window didn’t want to redraw after a resize. With the new IDE in the 3.x series this problem is gone, but now the main output window thinks it has a window decoration and is offset in a very ugly way, showing a wide grey bar at the bottom and a slightly less wide one on the right (probably the lack of a top window decoration and a scrollbar).

These problems can be solved by running a small program called wmname before starting a Java application.

(source: https://awesomewm.org/wiki/Problems_with_Java)
You can find it in the ‘suckless-tools’ package on Ubuntu/Debian.
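Concretely, the Awesome wiki suggests setting the name to LG3D, a window manager the JVM recognises, before launching – something like the following (assuming the Processing launcher is on your PATH as `processing`):

```shell
# pretend to be the LG3D window manager so the JVM's
# non-reparenting-WM quirks are disabled, then start Processing
wmname LG3D
processing &
```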

It seems this needs to be run only once in one of the terminals, and it then works across any subsequent commands in the current session. Probably not a stupid idea to put it in ~/.config/awesome/rc.lua?
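To run it automatically, a one-liner in rc.lua should do – a sketch assuming a recent Awesome (4.x) where awful.spawn is available; older versions used awful.util.spawn_with_shell instead:

```lua
-- at the end of ~/.config/awesome/rc.lua:
-- set a WM name the JVM recognises, so Java windows render correctly
awful.spawn.with_shell("wmname LG3D")
```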

The Making of Turns Me On video and EP

The writing below used to be a synopsis for some kind of vlog progress report about how the Turns Me On video was made. To be honest, I recorded the talk but didn’t like it; since it actually detailed the process in a brief way, I converted it into a blog post.

making of Turns Me On synopsis

The idea of this post is to give a peek at the behind the scenes ideas and mechanics that lead to the way the Turns Me On video turned out.

At the initial brainstorming, the basic problem in that early stage seemed to be this question: does a video for a track about sexual excitement really need hypersexualized images of women in any way? I really wanted to turn this stereotype on its head somehow, and first had the idea to go to clothes shops with a friend, a young male actor, take skirts, high heels, sexy tights and other women’s clothes, go to the changing rooms and have another friend shoot him trying things on. But I was already running late with finishing the main track, not to mention I wanted to make remixes and so on. So the video was on hold all this time as I was working on music, and I felt I needed a more flexible plan. And, you know, the need for flexibility usually translates into kinda do-it-yourself quick guerrilla work. At least with me.

Continue reading

Turns Me On EP

The intro does not leave you hanging: here comes an electro beat, lined with a relentless bassline and supported by a mix-up of samples from a well-known house track and a rhythmic litany by a male voice.

The title track has no refrain, no breakdown, just two long verses that keep on going and going. At first, the lyrics seem to be about those almost “stereotypically sexy” female adornments. The assumption that they are exclaimed with possible sexual aims is soon confirmed, but the lyrics in the second part of the track subvert the arousal, shifting it onto everyday phenomena not typically associated with sexual hints.

My desire to talk about a fairly contemporary state of male sexuality – some kind of media programming of a “reptilian” part of the male brain to be sexually triggered by an image – is not so new. Subjective history takes me back to the solo performance called “fiberoptikal”, where I wrote a song with the verses: “frozen little images / are blocking my sight / making me uptight / they’re like bondage rope”, referring to sexualized images of the female body, including arousing erotic lingerie, specific body parts and their shapes, nail polish, makeup, high heels, specific body poses and movements, etc. It was further developed in the performance “Frozen Images” by Wanda & Nova deViator (with Maja Delak), but didn’t make it onto our debut album “Pacification”, so only those who attended the “Frozen Images” concert performance(s) might know about it.

On the other hand, the Emanat institute started to develop a new type of feminist and movement-centered burlesque performance. Informed by remix/edit appropriations via electronic dance music and contemporary technological development, it was called “Image Snatchers”, a techno-burlesque. In autumn 2014 a workshop with the great performer Ursula Martinez took place, thanks to a collaboration with Ljubljana’s City of Women festival, where the seeds of another take on the ‘programming of the male mind’ were planted: a burlesque number called “Turns Me On”. While researching for that piece I subjectively gravitated to a bit of personal history, a popular early house track with a moaning female voice – in my current view another example of the voiced embodiment of hyper-sexualized desire, a sonic enactment of submission and objectification, and in retrospect a metaphor of manipulation for profit and greed by the white male capitalist.

While risking the consequences of copyright infringement and a fall into bootleg obscurity, the reference to French Kiss by the legendary Lil’ Louis is kept in clear ear-view – on purpose. Within the context explained above it should be obvious that this is a quotation, a reference to ponder and to critically reflect on the contemporary exploitation of our bodies via frozen, arrested, two-dimensional (sound-)images which turn us on, to which we masturbate, to which we may climax.

Dedicated to all who suffer from erectile dysfunction.

video

playlist

stream/download at Bandcamp:

credits

released August 15, 2016

written, produced and mixed by Nova deViator
vocals by Crucial Pink
mastering by Fred Miller
media & technical support: Radio Študent, Ljubljana

financial support by Emanat Institute and Patreon supporters patreon.com/novadeviator

Thanks to The Feminalz – the legendary Image Snatchers troupe, Emanat, Klub Gromka and Ursula Martinez

Shout-out to Renoise, Processing, SuperCollider, Ardour and Linux Audio communities

available also via Spotify, iTunes, Apple Music, Google Play, Amazon, Rdio, Deezer, Tidal, Microsoft Groove, MediaNet

soundcloud


Processing & multitouch

I wasn’t supposed to be doing this today, but:

Multitouch support on Linux seems much simpler with a different library than SMT, which I was unsuccessfully trying last year. “Simple Touch” actually “just works” via the evdev library. The only things needed were to supply the right event device from /dev/input and to make it readable and writable for a normal user.

Find out which event device is your touch screen:
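One way to do that is with the xinput utility (assuming you are in an X session with xinput installed):

```shell
# list all input devices known to the X server, together with their ids
xinput list
```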

Above you can see that the id of the touch-screen is 11, which means we have to allow access to /dev/input/event11.
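A quick, non-persistent way to do that (a udev rule would survive reboots), assuming the device really is event11 on your machine:

```shell
# make the touch-screen event device readable and writable for everyone
sudo chmod a+rw /dev/input/event11
```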

Alternatively (and perhaps much more reliably) one can check the contents of /dev/input/by-id/.
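For example:

```shell
# stable, human-readable symlinks pointing at the raw event nodes
ls -l /dev/input/by-id/
```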

You can see that the touch-screen reports as device “../event5”.

Simple Touch library is available from the Tools menu in Processing, but also here:
https://github.com/gohai/processing-simpletouch/

In Processing there are a couple of examples for the Simple Touch library, but the essentials for a successful start are:

1) list devices

will output something like

The touch-screen (event11) has index [3], so to open this device you finally use

It seems also to be possible to do just this:

and the following also works:

This opens the door to writing multitouch interfaces in Processing, which however means a lot more work. For now it seems a good way to go with writing the control GUI in SuperCollider, but eventually the Processing possibility seems very interesting and inviting.

So a final complete working example for my machine is this:

processing applet on desired monitor

An example of how to control which monitor the Processing applet (sketch output window) appears on if you’re using a multi-head setup:
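In Processing 3 this can be done with the fullScreen() variant that takes a display index – a minimal sketch to that effect (my own illustration, not necessarily the exact one used here):

```processing
// open the sketch full-screen on display 2 of a multi-head setup;
// displays are numbered from 1 in Processing 3
void settings() {
  fullScreen(2);
}

void draw() {
  background(0);
  ellipse(mouseX, mouseY, 40, 40);
}
```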

IF3 Progress Report #1

With the summer-time, working time on my new audio-visual piece, Interface Fractures III, began. It is now almost confirmed that the premiere showing at the Slovenian Cinematheque (Slovenska kinoteka) will most probably be on 15 September. Since the plan was to spend some quality sun & salt time on the Croatian coast, I brought some machinery with me on vacation. It’s always fun to work in the summer heat!

Anyway, with this next episode in the series I want to upgrade technically a “little bit”, so I acquired a better graphics card (Nvidia GTX960) and a multi-touch screen monitor with full HD 1080p resolution. Since I also added a 120GB SSD drive, I needed to reinstall the operating system (Ubuntu Studio 14.04.1); I compiled the drivers for Nvidia separately and the rest worked pretty much out of the box (after some apt-get-ing). Multi-touch is application-dependent, and my idea (for many years now) is to write custom interfaces for live sound/music/noise and visual composition and improvisation.

More technicalities: I compiled Processing and SuperCollider, and tried a multi-touch library in Processing (SMT), but it didn’t work. I filed an issue at their GitHub and went on with a version of SuperCollider that supports multi-touch (I was kindly pushed in the right direction by Scott Cazan, who added MT support to his own branch of SC on GitHub). After some basic testing I wrote a simple granulator with a GUI. I also tested a very basic idea of a TABs-like interface. In Processing I whetted my appetite with an exercise focused on off-screen rendering and blending two images together.

Processing: slice and blend screenshot

 

Continue reading

Processing: slice and blend

Here’s a little sketch in Processing that does the following: it loads an image, takes a horizontal and a vertical 1px slice, multiplies each slice into an off-screen image, blends the two images together, and displays the original and the blended one side by side. This is calculated dynamically each frame; the slices are determined by the position of the mouse.

Note: the image must be in the folder where your sketch is saved, and it must be 300×300 pixels in dimension.
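A minimal reconstruction of that idea might look like this (my own sketch following the description above, with a hypothetical image.png filename and MULTIPLY as the blend mode):

```processing
// Slice and blend: smear a 1px row and a 1px column taken at the mouse
// position across two off-screen buffers, blend them, and show the
// original and the result side by side.
PImage img;
PGraphics horiz, vert;

void setup() {
  size(600, 300);
  img = loadImage("image.png");      // 300x300, in the sketch folder
  horiz = createGraphics(300, 300);
  vert = createGraphics(300, 300);
}

void draw() {
  int mx = constrain(mouseX, 0, 299);
  int my = constrain(mouseY, 0, 299);

  // repeat the 1px-high row at mouseY down the whole buffer
  PImage row = img.get(0, my, 300, 1);
  horiz.beginDraw();
  for (int y = 0; y < 300; y++) horiz.image(row, 0, y);
  horiz.endDraw();

  // repeat the 1px-wide column at mouseX across the whole buffer
  PImage col = img.get(mx, 0, 1, 300);
  vert.beginDraw();
  for (int x = 0; x < 300; x++) vert.image(col, x, 0);
  vert.endDraw();

  // blend the vertical buffer into a copy of the horizontal one
  PImage blended = horiz.get();
  blended.blend(vert, 0, 0, 300, 300, 0, 0, 300, 300, MULTIPLY);

  image(img, 0, 0);        // original on the left
  image(blended, 300, 0);  // blended result on the right
}
```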

Processing: slice and blend screenshot

I still need to test this in a fullHD/1080p situation. I wonder if the CPU can take it at 60 frames per second. I actually suspect not. So many pixels and not on the GPU.

programming processing communities

Learning things has always been possible through two main routes (though surely many others exist): learning through a reference and learning by example. Personally, it’s quite hard for me to learn through a reference – it’s like learning the grammar and syntax of a foreign language without speaking the language. There must always be examples of use. Many of them. But who made the examples? Others. So, learning from others in an open (free software) world is a crucial element of today’s information society, IMHO. In the old days I learned HTML from other webpages.

Here are two sites packed with different processing examples to learn from:

SketchPatch: http://www.sketchpatch.net/
OpenProcessing: http://openprocessing.org/