Drawing on this expertise, I have collaborated with major companies and well-known media outlets.
I have presented my projects at music-tech festivals and events.
I love contributing to the advertising and creative industries,
and I enjoy creating unusual forms of interaction. But one of the things I love most is enhancing the user experience with the latest technologies.
Something feels hard. Is it you? Or me? Why don't you touch it, then? Fondle is an interactive sound installation that plays with the senses of touch and sound to explore what it means to seek out pleasure. Through recorded tracks and a touch-sensitive interface, the installation makes the familiar act of picking up an avocado sound intensely intimate and feel slightly uncomfortable. Go ahead and give it a hand job.
FaceTheColors is an interactive installation where people use their face as an interface to 'draw' their emotions, with different palettes of colors, on a large video projection. A face-recognition system uses the position and expression of each face in front of the camera to paint 'emotions', with different colors emerging from each person's facial expressions.
The video projection works as a mirror of the user, drawing an abstract color representation of people's expressions: a colorful experiment to play and experiment with your face. Each person and each expression has its own palette of colors.
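The per-person palettes described above can be sketched in a few lines. This is an illustrative reconstruction, not the installation's actual code: the function name, the number of palettes and the way an expression value shifts the color are all assumptions.

```python
import colorsys

def expression_to_color(person_id, expression, n_palettes=8):
    """Map a person's id and a normalized expression value (0.0-1.0)
    to an RGB color. Each person gets their own base hue (palette);
    the expression makes the color more vivid and nudges the hue.
    All names and constants here are illustrative assumptions."""
    base_hue = (person_id % n_palettes) / n_palettes   # one hue family per person
    hue = (base_hue + 0.1 * expression) % 1.0          # expression nudges the hue
    saturation = 0.4 + 0.6 * expression                # stronger expression = more vivid
    r, g, b = colorsys.hsv_to_rgb(hue, saturation, 1.0)
    return int(r * 255), int(g * 255), int(b * 255)
```

Feeding this color to each brush stroke is what makes two people, or two moods, paint visibly different pictures.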
The first prototype of this project was developed during the "Interactive computation: artificial vision applied to scenic arts and dance" workshop at Hangar in February 2016, based on an idea from the workgroup.
Workshop, March 2016
Role: Idea & Development
Technology: Processing, openFrameworks
Controlling analog synths with a tangible interface
R-Control is a tangible, touchable and collaborative interface and controller for interacting in a different way with analog synthesizers and electronic instruments. Conceived for live gigs as the centerpiece of the performance, it allows you to control, play and mix any kind of synthesizer that has a MIDI interface. It is strongly focused on the interface itself, and thanks to the visual feedback you get from it, it is easy to understand and to create with.
The main idea is to unite the world of musical interaction with the hardware and get the best out of each, maximizing performance and interactive sound: each part does its specific job, complemented by the visual appeal that the interface provides.
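Under the hood, everything R-Control sends to a synth is a plain MIDI message. As a minimal sketch of that layer (raw MIDI bytes per the MIDI 1.0 spec, not R-Control's actual code), this is how Control Change and Note On messages are built:

```python
def control_change(channel, controller, value):
    """Build a raw 3-byte MIDI Control Change message.
    channel: 0-15, controller: 0-127, value: 0-127."""
    if not (0 <= channel <= 15 and 0 <= controller <= 127 and 0 <= value <= 127):
        raise ValueError("MIDI fields out of range")
    status = 0xB0 | channel        # 0xB0 = Control Change status nibble
    return bytes([status, controller, value])

def note_on(channel, note, velocity):
    """Build a raw 3-byte MIDI Note On message (status nibble 0x90)."""
    return bytes([0x90 | channel, note & 0x7F, velocity & 0x7F])
```

A gesture on the tangible surface ultimately resolves to messages like these, sent to whatever synth is listening on that MIDI channel.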
This was the final project of the New Interfaces for Musical Expression postgraduate course at the Music Technology Group.
XMas Dlights was a project made for a Spanish radio station, Kiss FM, for the 2014 Christmas campaign. The goal was to replace the tedious, traditional and well-known Christmas carols of the Christmas tree with the radio's number-one hits of the year: specifically, the ones the Kiss FM audience loved most.
With this creative goal in mind, we conceived a complete standalone product: a little box in well-designed packaging that was sent as a gift to some influencers and a small number of Kiss FM listeners.
Technologically, it was an ambitious project built in record time: a standalone radio gadget that plays a special Christmas-only streaming service over Wi-Fi and makes the Christmas-tree lights integrated into the gadget blink to the rhythm of the music.
To this end we used the Raspberry Pi mini-computer platform with custom software and hardware, so as to deliver a good user experience and an easy step-by-step installation at home.
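The "blink to the rhythm" part can be sketched as a very rough onset detector: compare each audio frame's energy against the recent average and light the LEDs on a spike. The threshold, history length and function names below are illustrative assumptions; the gadget's real detection code is not documented here.

```python
def rms(frame):
    """Root-mean-square energy of one audio frame (samples in -1..1)."""
    return (sum(s * s for s in frame) / len(frame)) ** 0.5

def beat_gate(frames, threshold=1.3, history=8):
    """Light the LEDs when a frame's energy exceeds `threshold` times
    the average of the previous `history` frames. Returns one boolean
    per frame. Constants are illustrative, not the product's values."""
    energies, lights = [], []
    for frame in frames:
        e = rms(frame)
        recent = energies[-history:]
        avg = sum(recent) / len(recent) if recent else 0.0
        lights.append(avg > 0 and e > threshold * avg)
        energies.append(e)
    return lights
```

On the real device the boolean would drive a GPIO pin toggling the light string, rather than being collected in a list.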
This project was awarded a Silver Lion at the Cannes Festival.
After the great collaboration with my friend and musician Rodrigo Rammsy on the development of R-Control, we decided to make something new in a different way, exploring the audiovisual boundaries of live electronic music.
He was invited to play a new live gig at Telenoika, Barcelona's well-known open audiovisual creative community, so it was the perfect opportunity to build a new, experimental visual set for his live electronic music show based on hardware synths.
It was my first time working with live visuals. Rather than learning typical VJ software, I decided to build on what I had learned with the R-Control project: I created from scratch a new piece of software able to know what each of the hardware synths in his live set is playing and to render a visual interpretation of it. In addition, I connected a MIDI controller to a bunch of visual filters implemented in the software, letting me change them with the dynamics of the music for a totally free live interpretation during the gig.
The software was made using the openFrameworks C++ framework and has been in continuous development ever since. If interested, you can check the code in the GitHub repository.
Personal Project, November 2014
Role: Idea & Development
Technology: openFrameworks, OSMC, MIDI
Skateboard as a piece of Interactive Art
Skate Wall is a project born at LOLA Madrid in collaboration with Nomad Skateboards to decorate the walls of the agency in Barcelona. They gave us some recycled skateboard decks on which to create new artistic expressions.
The basis of the work is a digital illustration printed on vinyl. From there, we leveraged technology to give the idea more power: we installed a small Arduino-based circuit that controls a proximity sensor, red LEDs and a small speaker.
Thanks to the proximity sensor we know whether someone is in front of the board, how far away they are and for how long. Based on these inputs we decide which of the sounds, stored on a small sound board, to trigger. These sounds are amplified by another small chip that drives a loudspeaker. At the same time, we make the red LEDs embedded in the wood blink.
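The distance-and-dwell decision logic can be sketched as a tiny pure function. The thresholds, sample names and return values below are invented for illustration; the installation's real Arduino firmware uses its own values.

```python
def pick_sound(distance_cm, dwell_ms):
    """Decide which stored sound clip to trigger from the proximity
    reading and how long the visitor has been there. Thresholds and
    clip names are illustrative assumptions, not the real firmware's.
    Returns None when nobody is close enough."""
    if distance_cm > 150:          # nobody in front of the board
        return None
    if dwell_ms > 5000:            # lingering visitor: play the long sample
        return "long_sample"
    if distance_cm < 40:           # very close: play the close-range sample
        return "close_sample"
    return "greeting_sample"       # default for a passer-by
```

On the Arduino the same branching would select a track index on the sound board and start the LED blink pattern.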
Personal Project at Lola Mullen Lowe, June 2015
Role: Concept & Development
Technology: Arduino, Audio, Electronics
Mira AV Light Show
Lights vs Pixels, Playing with interactivity
This installation was a light show running throughout a DJ set at Mira Festival 2014 in Barcelona, made together with all the participants of the "Lights vs Pixels, Playing with Interactivity" workshop at the Mira Live Visual Arts Camp, conducted by the French interaction design studio Screen Club.
In this workshop we worked on the technical issues and skills needed to map images, videos and animations onto a "low-resolution" matrix of light bulbs: first from an interaction design perspective, and then from the technical implementation side.
Finally, the light show was an 8x8 matrix of light bulbs controlled by custom software written in Processing. The software allowed us to create several music-reactive animations and to drive them with all kinds of external MIDI controllers.
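The core of mapping any image onto the bulb matrix is average pooling: collapse each block of pixels into one brightness value per bulb. This is a from-scratch sketch of that idea, not the workshop's Processing code, and it assumes a grayscale frame whose dimensions divide evenly by 8.

```python
def downsample_to_matrix(frame, size=8):
    """Average-pool a grayscale frame (2D list, values 0-255) down to
    a size x size grid of bulb brightness values."""
    h, w = len(frame), len(frame[0])
    cell_h, cell_w = h // size, w // size
    matrix = []
    for row in range(size):
        out_row = []
        for col in range(size):
            total = 0
            for y in range(row * cell_h, (row + 1) * cell_h):
                for x in range(col * cell_w, (col + 1) * cell_w):
                    total += frame[y][x]
            out_row.append(total // (cell_h * cell_w))   # average of the cell
        matrix.append(out_row)
    return matrix
```

Each of the 64 resulting values would then be sent to the dimmer driving the corresponding bulb.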
The app that allows kids to see what speed their cars can reach. TrueSpeed detects the speed of a Hot Wheels car and converts it into real scale.
Kids will not only measure the exact speed of a Hot Wheels car in km/h, they will also be able to see what that speed would be if the toy cars were real-size cars. And since we want kids to spend more time in the real world and less in the virtual one, the app also allows them to play with friends, challenging each other to break speed records with their respective cars.
A simple and fun mobile app that uses technology to encourage kids to play one of the most popular real-life games of all time: car racing.
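The scale conversion at the heart of the app is simple arithmetic: measure the toy's speed over a known distance, then multiply by the model scale. As a sketch only: the function name is invented, and the 1:64 factor is my assumption based on the common Hot Wheels scale, not a constant confirmed by the app.

```python
def real_scale_speed(track_length_m, elapsed_s, scale=64):
    """Convert a toy car's measured speed into its real-scale equivalent.
    `scale` assumes roughly 1:64 Hot Wheels scale (an assumption here).
    Returns (toy_kmh, scaled_kmh), both rounded to one decimal."""
    toy_ms = track_length_m / elapsed_s    # m/s over the measured stretch
    toy_kmh = toy_ms * 3.6                 # m/s -> km/h
    return round(toy_kmh, 1), round(toy_kmh * scale, 1)
```

So a toy covering half a meter in half a second is doing 3.6 km/h, which reads as a far more exciting real-scale figure on screen.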
My role here was to study the viability of the application and create a working prototype. After that, an independent development studio was in charge of the full implementation.
World Cup Race was a "phygital" experiment made by Lola Hack Lab: an old Scalextric set with two hacked cars connected to Twitter, as a way of visualizing the digital world in the real one.
The stream of official "hashflags" implemented by Twitter to support the different national teams at the 2014 World Cup was converted into a real race between two slot cars powered by those tweets. Each tweet for a team made its car go further, and the fans who tweeted the most won the race!
The Scalextric track was modified using the Arduino platform in two ways. On one hand, we connected the movement of the race cars in real time to the official country hashflags being tweeted during the match: each hashflag mentioned advances the corresponding car a fixed distance along the track. On the other hand, so we could see who was winning the race, we installed a digital lap counter. In addition, you could watch the cars race via live streaming with our two strategically placed webcams. Each race started an hour before the match and continued throughout the game.
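The tweet-to-distance rule above can be sketched as a small function. The hashflag names, step size and track length are illustrative assumptions, not the project's real constants, and the real build drove motors from an Arduino rather than updating a dictionary.

```python
def advance_cars(positions, hashflag_stream, step_cm=5, track_cm=500):
    """Advance each slot car a fixed distance per tweet of its team's
    hashflag. `positions` maps hashflag -> centimeters travelled.
    Returns the updated positions and the current leader."""
    for tag in hashflag_stream:
        if tag in positions:   # ignore tweets for teams not on the track
            positions[tag] = min(positions[tag] + step_cm, track_cm)
    leader = max(positions, key=positions.get)
    return positions, leader
```

Replaying an hour's worth of hashflags through a rule like this is exactly what turned the fans' tweet volume into lap counts on the physical track.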