
J.VIEWS’
CRUCIAL GEAR:
—————————————
Cubase 8.5

Shure Level-Loc Limiter

 —————————————

In any creative field, there is really no way to stick around for long without some level of self-reinvention. Interestingly, NYC electronic musician J.Views (previously known as J.Viewz) found a way to make this process less about “self” and more about others: his new full-length ‘401 Days’ was in part created by his fans, who participated in its step-by-step compositional process by commenting and adding sounds and images of their own over the course of 401 days. That sounded so wild, we had to ask J. some questions about it!

What led you to open up your music’s creative process to random people, in ‘401 Days’?

I felt like a big part of the artistic value of my songs is actually in the process of making them. I wanted to share what happens when I don’t know what the song is going to sound like, the curiosity, the experimentation. Those things exist before the song becomes a ‘final product’, and I wanted to share them with the people I care about, with whom I want to truly connect, and to whom I want to present the full version of my art.

Were there special rules about how fans could contribute?

Not really, I only asked fans to chip in on a couple of occasions, and the stuff I got was really cool and helpful and creative. I did get a couple of demos from Russian techno producers in the pile; I guess you can’t have it all…

 

Did this make the process of finalizing a song easier or more complicated?

Both. It made it a bit more straightforward in a way, because I didn’t have time to dwell on what I’m ‘supposed to sound like’. I didn’t have time to really think much about branding and about perfection; I was just making music. So in that sense it was pure joy, creativity, sanity. But then documenting the whole process like I did was time-consuming, and it happened while I was making the songs. I wanted to really present a visual story that had some coherence with the music, so there was lots of video editing and coloring, and the entire journey was just continuous output.


J.Views’ favorite virtual synths: GForce Minimonsta and Imposcar.

How did this experiment affect your music compared to the rest of your repertoire?

It ended up being the most personal piece of work I’ve ever released, by far. Opening up the process can be very demanding if you are not being real, so I decided to just present the truth as I go, to make the process easy. It resulted in music that feels real to me. Not that the previous material I’d done is not ‘real’, but here I really got to go raw and stay with first takes, and honor the immediacy of everything. I wanted to release relevant songs when they are relevant for me, not a year later…

Tell me about the platform you used (DNA Project) to involve your fans in this experiment.

The platform was created especially for this album-making project by a genius company called Hello Monday. They’ve taken my vision and made it simply better and more beautiful. The DNA Project was the name of this documentation process all the way until I finished the album and called it 401 Days. It showed every moment and every milestone on the way to each song, in real-time.

What DAW did you use to work on the final mixes? 

Cubase.

What synths, controllers and plug-ins were particularly inspiring or useful while working on ‘401 Days’?

Outboard, I used lots of the Shure Level-Loc for heavy compression, an Orban Optimod compressor, a Roland Space Echo, and the main pre-amp was a Neve 1084 through BURL converters. In the box: lots of SoundToys (Decapitator on basically every channel), UAD plug-ins (LOVE), and the Eventide Omnipressor. Minimonsta and Imposcar are my main synths. I semi-mastered everything while tracking and mixing, and eventually Matt Colton mastered the album. It’s possibly the best mastering I’ve had, especially the work he’s done cutting the vinyl; a true craftsman.

Tell us about the video for #almostforgot

It is a heart-controlled music video. The song and video do not use a computerized metronome; the tempo of the song is determined by the listener’s heart rate in real time. The listener places their finger on the smartphone camera, our algorithm detects the small changes in skin color with each pulse, works out the heart rate from that, and locks the song’s tempo onto it. (There’s a tutorial about that video here.)
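To give a rough sense of how that kind of finger-on-the-camera detection can work, here is a minimal Python sketch, not the actual app’s code: it estimates a BPM from average frame-brightness samples and that number could then be used as the song’s tempo. The frame rate, peak spacing and function names are assumptions for illustration only.

import numpy as np
from scipy.signal import find_peaks

def estimate_bpm(brightness, fps=30.0):
    """Estimate heart rate in BPM from average frame-brightness samples."""
    signal = np.asarray(brightness, dtype=float)
    signal = signal - signal.mean()                # remove the DC offset
    # Each heartbeat shows up as a small brightness pulse; keep peaks at least
    # 0.3 s apart (i.e. under 200 BPM) to reject jitter.
    peaks, _ = find_peaks(signal, distance=int(0.3 * fps))
    if len(peaks) < 2:
        return None                                # not enough beats detected yet
    beat_interval = np.mean(np.diff(peaks)) / fps  # seconds per beat
    return 60.0 / beat_interval                    # beats per minute

# Quick check with a fake 10-second recording pulsing at about 72 BPM
t = np.arange(0, 10, 1 / 30.0)
fake_brightness = 0.5 + 0.05 * np.sin(2 * np.pi * (72 / 60.0) * t)
print(round(estimate_bpm(fake_brightness)))        # prints roughly 72

In a real app the brightness samples would come from the phone camera with the flash on, and the detected BPM would be smoothed over time before the playback engine locks onto it.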

What about this video from a few years ago?

I used the Makey Makey controller, going into Ableton with another piece of bridging MIDI software, and controlled delay time, looping, and reverbs via the Novation SL MK 25.
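As a loose illustration of that kind of mapping, a small Python sketch using the mido library can send MIDI CC messages toward a DAW to sweep a delay time while easing down a reverb send. The port name and CC numbers here are assumptions, not J.Views’ actual routing; in practice you would MIDI-learn the parameters in the DAW.

import time
import mido

CC_DELAY_TIME = 12    # assumed CC number mapped to a delay time knob
CC_REVERB_SEND = 91   # assumed CC number mapped to a reverb send

# Open a virtual/loopback MIDI port that the DAW listens to
# ('IAC Driver Bus 1' is a macOS example; the name is an assumption).
with mido.open_output('IAC Driver Bus 1') as port:
    # Sweep the delay time up while easing the reverb send down.
    for step in range(128):
        port.send(mido.Message('control_change', control=CC_DELAY_TIME, value=step))
        port.send(mido.Message('control_change', control=CC_REVERB_SEND, value=127 - step))
        time.sleep(0.02)                           # roughly a 2.5-second sweep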

J.Views’ own Novation SL MK 25
