Touch Update – New York Premiere

I’ve been on the road for a few days with the Touch Update team preparing for our show’s premiere at New York Live Arts. This past week has been fantastic, with our two showings at Dance Place in D.C. Their communications coordinator, Amanda, even made a video for Instagram TV.

https://www.instagram.com/tv/BqAY_psnmSZ

The project has come a long way since I first worked on the projection mapping masks in 2015. When I get a chance, I’ll write about the creation of the masks and video sculpture. Until then, here’s a brief overview of the show.

Bill Shannon premieres Touch Update, a new work exploring contemporary modes of digital versus interpersonal representation and physicality. Shannon combines movement, wearable projection technology, and video installation to explore the social constructions that surround disabled bodies. His Touch Update features collaborators and performers Ron Chunn, Teena Marie Custer, Raphael Botelho Nepomuceno, Jacquea Mae, David Whitewolf, Cornelius Henke III, Anna Thompson, and Taylor Knight.

 

If you’re in NYC, come to the show. Here’s the info.

New York Live Arts, 219 W 19th St. New York NY 10011 (between 7th and 8th Ave)

Closest to the 1/2/3, A/C/E, and L trains

Tickets: Start at $15

Wednesday, November 14 at 7:30pm

Thursday, November 15 at 7:30pm

Friday, November 16 at 7:30pm

Saturday, November 17 at 7:30 pm


Come see the show in DC or NYC. #Repost @whatiswhatdotcom (@get_repost) ・・・ #touchupdate Rehearsal Run recorded @paintedbrideartcenter The show is headed next to @danceplacedc for two shows November 10th and 11th. Rehearsal performances here by @rapha_nepomuceno on crutches wearing facial performance for video mask by Ron Chunn and @slowdanger__ wearing facial performance for video mask by @teemarievft Video Editing /Mapping & RPi Programming @projectileobjects Mask concept and fabrication, script, production design @whatiswhatdotcom sound design @slowdanger__ The Pittsburgh shooter said “screw your optics, I’m going in” This piece was created in part as a response to those optics of suffering and the ongoing decline of the state of the world on a myriad of levels. Over my lifetime I have repeatedly turned to dance and music to heal, to escape and to transcend. If you miss DC please check us in NYC @nylivearts November 14th - 17th . . . . . . . . . #antihate #antifascist #billshannon #videoart #wearables #videoartist #interdisciplinaryart #interdisciplinaryartist #performanceart #dance #disability #ProjectileObjects


This performance of Bill Shannon is made possible by the New England Foundation for the Arts’ National Dance Project, with lead funding from the Doris Duke Charitable Foundation and The Andrew W. Mellon Foundation. Bill Shannon’s Touch Update is a National Performance Network / Visual Artists Network (NPN / VAN) Creation and Development Fund Project co-commissioned by Kelly Strayhorn Theater in partnership with Painted Bride Art Center, Dance Place, and NPN. For more information: npnweb.org.


slowdanger crowdfunding.

I’ve had the honor of working with this performance entity, slowdanger. They have launched a crowdfunding campaign to support their 2018 fall performance season; if you have an interest in their work, please share or contribute to their campaign. Thanks!

https://www.indiegogo.com/project/slowdanger-2018-performance-season/embedded


slowdanger x ProjectileObjects collaborating for BedStock 2018

Critically acclaimed performance entity, slowdanger, embarks on their 2018 Fall Performance Season.

Source: slowdanger 2018 Performance Season

Projection Mapping with the Raspberry Pi

Raspberry Pis are compact, affordable, and powerful enough to handle a wide range of projects. In this article, I’ll go over some Pi-powered options and a simple setup. Whether you’re a projection mapping beginner or a pro, the RPi may be perfect for your next project.

A disclaimer before we start: basic Linux commands and knowledge are recommended for anyone looking to work with a Raspberry Pi. The exception is the miniMAD, which integrates with MadMapper’s software and is easier to set up. The PocketVJ runs off of ofxPiMapper and features a web portal for control, but ofxPiMapper by itself will take a little research to become proficient with. I’ve been using Raspberry Pis for installations and various projects for years now. My longest-running Pi install has been going for three years, and I have yet to replace anything more than a microSD card. That’s no guarantee that the Pi is perfect for you, but it should come as reassurance that they’re built to last.

When it comes to projection mapping with the Raspberry Pi, a few options come to mind: Madmapper’s miniMAD, ofxPiMapper, and the all-in-one PocketVJ. All three are excellent choices, and each has pros and cons that you’ll want to take into consideration.


ofxPiMapper

If you haven’t tried it already, openFrameworks’ ofxPiMapper simply works. You’ll want to plug a keyboard and mouse into the device to control it, but it boasts a wide feature set and continual developer support.

Pros

  • Price (Free)
  • Open source
  • Circle Surface
  • Grid Warping

Cons

  • Small learning curve
  • No Sync

miniMAD


I’ve documented how easy it is to get up and running with the miniMAD here, but with its price point of $220 plus a MadMapper license ($420 to own, $42/month to rent), it may stretch your budget further than you’d like. Currently, the considerable advantage the miniMAD has over all other RPi mappers is that you can sync a lot of them over Ethernet for multi-mapped installations. It uses MadMapper (Mac/PC) for setup, which makes it the most natural RPi projection mapper that I’ve tested. Here’s a gif from the developer of 28 miniMADs in sync.

28miniMADsync4.gif

Pros

  • Intuitive setup
  • Video sync
  • DMX lighting control
  • OSC control w/ TouchOSC layouts
  • GPIO buttons

Cons

  • Limited customization (not open source)
  • No Wifi
  • Price / Requires MadMapper (not a terrible thing!)

PocketVJ

3-pocketvj-contents-1-560x3141

PocketVJ 3.5 is a portable multimedia tool based on the Raspberry Pi platform. It hosts a wide feature set that is easily accessed from a phone or computer through a custom web control panel (CP). Unlike the miniMAD, the PVJ supports wifi connections and video player sync. But like ofxPiMapper (which the PVJ uses), you cannot sync mapped content. An easy way around this is to record the mapped output from a computer, load the pre-rendered videos back onto the Pi, and play them in sync. Since the PVJ has PiWall and a synced video player, you can string them together for a multi-projector installation. Not to mention, you can run custom scripts through the web CP and automate your installations.

Pros

  • Open source
  • Web control panel (Access from phone or computer)
  • Video sync over Ethernet or Wifi (not with mapping)
  • Projector remote control & scheduling
  • Screen Sharing
  • VJing
  • Pi 3 compatible
  • Image view
  • FTP Browser
  • Image player
  • TCPSyphon
  • OLA, DMX, & QLC+ support.
  • Customizable
  • Pi Camera supported
  • OSC control (Latest version 8.15.2018)
  • Tutorial videos and documentation

Cons

  • No video sync with mapping.
  • Learning curve.

PocketVJ

If you have a Raspberry Pi 2 or 3 lying around, you can go here, follow the instructions, and build a PocketVJ from scratch, or support the developers by buying a PVJ from Switzerland. If you do it yourself, make sure to update the CP (control panel) to the latest version: follow the instructions on the page, and you’re up to date.

miniMAD

If you want a miniMAD, you’ll have to buy one from Madmapper. Don’t forget to buy or rent a MadMapper license if you don’t already have one.

USA customers: https://shop.blinkinlabs.com/products/minimad

Everywhere Else: http://shop.garagecube.com/minimad


 

ofxPiMapper setup & install

Here’s one you can set up now.

Go to https://ofxpimapper.com/ and download the disk image. (It should end in a .img.zip):

Support the developers 🙂 if you’d like.

Download and install Etcher (Mac/PC/Linux).

Insert a microSD card into your computer, open Etcher (it should auto-select the drive), select the PiMapper image, and click Flash.

projectileobjects pi mapper install

Once it’s done, insert the microSD card into your Pi. Plug in a USB mouse and/or keyboard. Power up the Pi with it plugged into your projector. (Optional: connect it to a network via Ethernet for additional features and control.)

The Pi should boot up and present you with an example triangle.

02 tutorial ofxpimapper1_4

Using the 1, 2, 3, and 4 keys to switch between the different modes, you can manipulate and add layers, change the source video, and make other adjustments with the keyboard. I recommend a wired USB mouse, as I’ve run into issues with wireless mice being choppy. Experiment with it, and hit the “i” key if you want keyboard command info on the screen.

03 tutorial ofxpimapper2_4

If you want to access the Raspberry Pi command prompt, hit the “Esc” key and you’ll find yourself at a screen that says “opm login:”. The username is “pi” and the password is “raspberry.” If you’re stuck at this menu, log in and type “sudo reboot now”; this will restart the Pi and get you back to the ofxPiMapper example.

If you want to use your own media, plug in a USB thumb drive with the photos and videos (.mp4) and reboot the Pi. It should detect the files on the thumb drive for use in your new layers. (See the ofxPiMapper documentation for file types and troubleshooting.)

04 tutorial ofxpimapper3_4-1

ofxPiMapper is not as easy as the PocketVJ’s web FTP & control panel or MadMapper’s export feature to the miniMAD, but if you have a Pi and 30 minutes, this basic ofx mapping example is enough to get you started.


If you like what you see, thank the developers by supporting their projects.

Support PocketVJ with a purchase or by donating via PayPal to info@magdesign.ch.

Similarly, ofxPiMapper has multiple donation options on their website: https://ofxpimapper.com/

 

miniMAD gets Wifi for TouchOSC

I’ve been doing a lot lately with the new OSC implementation in the PocketVJ (www.PocketVJ.com). It even works great with the Raspberry Pi 3’s built-in wifi, so I can use my phone to control the PocketVJ when it’s hung inconveniently high up with the projector. 🙂

That being said, before I knew about the PocketVJ, I was playing around with Madmapper’s miniMAD. They’ve made a lot of changes to it since its release, but it still doesn’t offer Pi 3 support or wifi… and that’s a bummer. They did implement OSC, so I figured, why not give the miniMAD wifi of its own for ~$20?

madmapper minimad wifi raspberry pi 2 glinet

And the best part is that this little wifi adapter runs off of a 5V 1A micro-USB cable that I can plug directly into the RPi. I’m using the GL.iNet GL-MT300N-V2; it’s a bit overkill for this, but it works great out of the box. Since MadMapper doesn’t have any immediate plans to give the miniMAD wifi, this little box will do the trick.

I did have one issue with the miniMAD giving TouchOSC an incorrect IP address, but I corrected it by logging into the back end of the small router and pulling its correct IP address (192.168.3.27).

 

miniMAD wifi

miniMAD IP address was not correct. I had to enter it manually.

 

Then I had playback control over the miniMAD through the TouchOSC app (iOS/Android).

You could also send network commands from other OSC apps such as QLab by using matching outgoing/incoming port settings and executing commands as simple as ‘/pause’ or ‘/play’.
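Under the hood, an OSC command like ‘/play’ is just a short UDP packet containing a padded address string and type-tag string. Here’s a rough standard-library Python sketch of what gets sent; the IP and port are placeholders for your own device’s settings, not miniMAD specifics:

```python
import socket

def osc_message(address, type_tags=""):
    """Encode a minimal OSC message: the address and the type-tag string
    are each null-terminated and padded to a 4-byte boundary."""
    def pad(s):
        b = s.encode() + b"\x00"
        return b + b"\x00" * (-len(b) % 4)
    return pad(address) + pad("," + type_tags)

def send_osc(address, host, port):
    """Fire one argument-less OSC command over UDP."""
    packet = osc_message(address)
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(packet, (host, port))
    return packet

# e.g. send_osc("/play", "192.168.3.27", 8000)  # IP/port are placeholders
```

Apps like TouchOSC and QLab build exactly this kind of packet for you; the point here is just that there’s no magic in it.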

I’ll have more about the PocketVJ TouchOSC layout as soon as I get some more time to hack it together. Here’s a preview of what it will look like 😉

 

 

Three inspirational A/V sets

It’s been a while since I posted to this blog, but after a night out, I felt inspired by some awesome artists who live in the city I call home. In VJing there is no “one way” or “right way” to do anything. It’s an openly expressive medium that offers up incredible opportunities for collaboration and creation.

On June 22nd, 2018, {arsonist}, Char Stiles, Spednar, REW, and Jeremy Bible took over Pittsburgh’s Mattress Factory for an evening of immersive sights, sounds, and a sort of contemplative musical-emotional therapy. The place was packed and entranced during each of the three sets. For this article, I reached out to all of them to explain their process and workflow. I’ll let the videos speak for themselves.

 

Set 1:  {arsonist} + Char Stiles -“eCosystem”

Mattress Factory Arsonist Char Stiles 2018

“Pittsburgh’s {arsonist} (Danielle Rager) and Char Stiles use live coded audio and visuals to create simulated ecologies. Musical phrases dynamically constrain rules of automaton interactions, leading to the genesis and destruction of synthesized life forms and their sounds. The symbiotic exchanges of the virtual organisms mirror the macro system of feedback between the model and the humans generating the collaborative A/V experience.”


{arsonist}

 “My music was a combination of my live electric violin playing, Ableton Live, and the live coding language TidalCycles (https://tidalcycles.org/), which allows you to do very complex/intricate sequencing of audio samples using the concise syntax of a functional programming language built atop Haskell. I internally route the audio of the music I create in TidalCycles to Ableton Live using Jack (http://jackaudio.org/). Together, Jack and a TidalCycles “parameter context” called “orbits” allow me to “stem” the music I make in Tidal out to Live in realtime, that is, have one input channel in Live for every channel of sound created in Tidal. I make sure that my audio between Ableton Live and TidalCycles is synced to the same clock using TidaLink.

The show at the Mattress Factory was using a special 8-channel sound system, so my audio went from Ableton Live to 8 output channels on my audio interface and then to the speaker system. I was able to use Ableton Live to program audio pans on the 8-channel system, or direct certain sounds to certain speakers. @CharStiles received a copy of my master audio output, which controlled parameters in her live-coded visuals. She frequency-binned my audio input to get the gain for a few frequency bands and used these to make her visuals audio-reactive, along with the MIDI note and duration information that I sent her from every track’s lead synth line. She can tell you more about her own live-coding visual process and how she unpacked and used my audio and MIDI signals.” -Danielle
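The frequency binning Danielle describes (splitting an audio buffer’s spectrum into a few bands and taking one gain per band) can be sketched in plain Python. This illustrates the general technique, not their actual tooling; a naive DFT stands in for a real FFT library:

```python
import cmath, math

def dft(samples):
    """Naive DFT, fine for illustration; use a real FFT for live audio."""
    n = len(samples)
    return [sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n)) for k in range(n)]

def band_gains(samples, bands=4):
    """Split the lower half of the spectrum into `bands` equal bins
    and return one average magnitude ("gain") per bin."""
    spectrum = [abs(c) for c in dft(samples)[: len(samples) // 2]]
    size = len(spectrum) // bands
    return [sum(spectrum[i * size:(i + 1) * size]) / size
            for i in range(bands)]

# A pure tone sitting at DFT bin 10 of a 256-sample window
# should put almost all its energy in the lowest band.
tone = [math.sin(2 * math.pi * 10 * t / 256) for t in range(256)]
gains = band_gains(tone)
```

In a live rig, each element of `gains` would be bound to a shader parameter, which is essentially what Kodelife’s binning interface does for you.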

Char Stiles

Char was running Kodelife from two different laptops into a Roland HDMI mixer and out to the projector. Since Kodelife is a scene-based live-coding application, it doesn’t offer smooth transitions from one scene to the next. That’s where the HDMI mixer came in: the audience didn’t have to bear seeing the Finder interface.

She had audio and MIDI input from Danielle to make the organisms dance to the music. Two parameters from Danielle were bound to variables in the shader: the frequency (a certain band picked out by the binning interface in Kodelife) and the MIDI notes from the main “voice”.

A quick word on the content: she implemented a version of Conway’s Game of Life called SmoothLife (https://arxiv.org/pdf/1111.1567.pdf) in one shader pass, then used that texture as a lookup for the ray-marched organisms in the second pass. She’ll soon have a tutorial on how she implemented ray marching in Kodelife.
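SmoothLife is a continuous generalization of Conway’s Game of Life. For reference, the discrete rule it extends fits in a few lines of Python; this is the classic cellular automaton, not Char’s shader code:

```python
from collections import Counter

def life_step(alive):
    """One Game of Life generation. `alive` is a set of (x, y) cells."""
    # Count live neighbors for every cell adjacent to a live cell.
    counts = Counter((x + dx, y + dy)
                     for x, y in alive
                     for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    # Survive with 2-3 live neighbors; a dead cell is born with exactly 3.
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in alive)}

# A "blinker" oscillates between a horizontal and a vertical bar.
blinker = {(0, 1), (1, 1), (2, 1)}
```

SmoothLife replaces the integer neighbor count with smooth inner/outer averages, which is what makes it amenable to a GPU fragment shader.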

Here’s a promo from their set together:

 

Set 2: Spednar + REW – “Computational Chiaroscuro”

Mattress Factory Rew Spednar 2018

Pittsburgh’s Kevin Bednar and Rachel Wagner have created an audio/visual piece which entwines minimalism and maximalism through a monochromatic lens. The performance presents fleeting and evasive visual states informed by shifting rhythmic structures. Composed using Python, vvvv, Processing, and Resolume, the visuals augment an intentionally absent aural palette to complete a multisensory experience.


Spednar

Spednar used TidalCycles to live-code the audio, Dirt to send the audio to 8 output channels, and jackmixer as a graphical mixing tool to MIDI-map the 8 channel levels. To explain this in detail, he compiled a GitHub repo about the process: https://github.com/kbdnr/Tidal-multichannel

REW

Rew: “For the Computational Chiaroscuro visuals, there are two types of videos that overlap. The first type are the shape layers, created by Rew in After Effects, and these act as masks for the second type of videos, which were created by Spednar. To make these, he programmed random alphanumeric characters to appear and scroll across the computer screen, and they were screen-recorded. Rew’s setup to perform with these visuals includes her computer with a GeForce GTX 1080 graphics card and an AMD RYZEN 7 1700 8-core CPU, the software Resolume Avenue 6, an APC20 midi controller, a Korg nanoKONTROL2 midi controller, and a Focusrite Scarlett 2i4 audio interface. The audio interface is what Spednar and Rew use to get the visuals to react to the audio; Spednar’s audio signal is sent through it to Rew’s computer, then Resolume Avenue uses the audio FFT from it. Additionally, the artists have the ability to record the visuals + audio straight from Rew’s computer using an AVerMedia 2 capture device and an HDMI splitter.”

Here’s a snippet from their performance:

 

Set 3: Jeremy Bible – “Human Savagery”

Jeremy Bible Human Savagery Mattress Factory 2018

Human Savagery (in 8.2 channel surround sound) pairs Bible’s powerful & moving, symphonic ambient compositions with his vivid ultra-HD aerial cinematography – which contrast the alien beauty of untouched mountains and desert landscapes with the chemical violence of EPA Superfund sites ravaged by industry. Bible has visited sites across the United States to create this footage, capturing a gripping snapshot of humanity’s often unseen footprint on this planet.

Jeremy Bible is an American transdisciplinary artist focused on the intersection of sound and light. Taking cues from his background in field recording and photography, his work occupies a shared space between multi-channel audio-visual installation and compositional practice. Utilizing acousmatic sonic diffusion techniques, Bible strives to construct an environmental experience that is at turns profoundly sensory and beautifully surreal.

Mattress Factory 2018 Drone AV performance

http://jeremybible.com/humansavagery

VJing & Projection Mapping on Mac & PC: live coding, VDMX, Madmapper

 

I thought it would be a good idea to document the process used for Cosmic Sound’s Altared event with Bary Center, J Butler, To Sleep At Night, and Dilettante. It wasn’t my first time using a multiple-projector/computer setup, but adding a PC and a live-coding VJ into the mix needed a new workflow. There’s a video from the show below, and if you’re interested in the process of sharing live VJing content across computers and operating systems, read the entire post.

Here are some short video clips from Altared, which I co-VJ’ed with the awesome Char Stiles.

GLSL, Live coding VJing, Mac & PC, projection mapping, VDMX, Mad Mapper, Kodelife, Lumen App, Spout, Syphon, UVC HDMI Capture, four channel HDMI mixers, and fun!

That statement above says it all. Now I’ll go into the details.

As Char writes on her blog (http://charstiles.com/performing-coding/), we were using a Windows 10 laptop running Madmapper and Kodelife. While there is little documentation on Kodelife, it is very intuitive if you understand the world of GLSL and live creative coding. Simply put, GLSL is a shader language that runs directly on the GPU. Skip the 4-hour Blender or After Effects rendering queue and process visuals in real time: change a few lines of code and tweak them as you go. This style of VJing offers some big live-performance pros and cons, and as time goes on, it will only get better.

 

Whether you know it or not, a good number of VJ and projection mapping applications have been making use of GLSL shaders for years. VDMX, for instance, has its own brand of shader called an ISF; Modul8, Madmapper, Resolume, etc. have their equivalents. The pros of GLSL are in the performance possibilities. Here’s a YouTube video that rendered out in real time, and its executable file size is tiny in comparison to the captured video size. The downside is you can change a line of code and end up with a VJ blank screen of death (VBSOD).

Char and I wanted to collaborate, and what better way than for both of us to share our VJ feeds with one another: I get hers, and she’ll have mine, allowing both of us to do our own thing or tweak each other’s feed. This process led to some visceral results but took a few trial runs to get it all figured out. Our setup looked something like this:

Using a combination of NDI, Syphon, Spout, and a USB 3.0 HDMI capture device, we could share our feeds, loop them, and mix/blend/mash all of the above.

 

Here was one hurdle we had to tackle: Kodelife -> Spout -> NDI. Kodelife works with Spout (PC) & Syphon (Mac) natively, which is immensely helpful for sharing its real-time graphics output on the same machine. But when you want to share it with another computer, two options are an Ethernet cable or an HDMI capture box. (I’ve written about these boxes here: link.) For this, we did both, but either one would have worked. The problem was that the Spout-to-NDI .exe application needs a reset when you open up another Kodelife shader (or scene). My MacBook Pro with VDMX & Madmapper could easily import the feed from Char’s PC running Kodelife, but every time the scene changed, the NDI connection would get dropped. This is where the magic of Madmapper came in. Madmapper imports Spout & Syphon natively, so when Kodelife is running, Madmapper sees Kodelife’s Spout or Syphon output. Madmapper also has an NDI sender and receiver built in. And even better, one Madmapper license is good for up to two machines. By using Madmapper -> NDI over Ethernet to my Mac, also running Madmapper, we were able to work around the Spout-to-NDI Windows program reset issue; Madmapper sends a signal as long as the application is open.

 

So Char’s Kodelife -> Spout -> Madmapper -> NDI -> over an Ethernet cable -> to my MacBook Pro running Madmapper -> VDMX -> back to my Madmapper -> out to a projector was a huge chunk of our setup. While this may seem congested, there was little to no latency in the process. From laptop to laptop, you can use a single CAT6 (CAT5 will work) to share visuals back and forth. We also used my Roland 4-input HDMI mixer as a hardware device that could select either of our feeds and override the output to the projector. So if Char wanted to run the show, she could slide a lever and take control, and vice versa. This also gives us the ability to work with more VJs or bring in live camera feeds, feedback loops, etc. (Enter Altared II on 3.31.2018. 🙂 Char, ProjectileObjects, and Nate Cake will all be VJing together using this setup.) By adding a router or switch into the mix, Nate will be able to share his NDI output from Modul8, the VJ software that he likes to use.

 

It’s a blessing to be able to collaborate with such remarkable individuals, and thankfully the technology continues to evolve to make that ever more possible. Thanks to Char Stiles; Hexler.net, the makers of Kodelife; Madmapper; VDMX; and all the individuals who made this whole process possible.

 


SlowDanger LED Ring Moving Shadows

I’ve known the slowdanger duo for some years now. During a recent residency with them and Bill Shannon, we had time to talk about a future collab. I sent them this video of moving shadows (gif now).

They liked the idea and wanted it to be portable for touring. I tossed around some ideas, but a series of spinning lights seemed impractical. I had some RGB LED strips lying around, and after a few nights of experiments that are too embarrassing to post, I built a giant ring to test. The 50′-circumference ring breaks down into ten 5′ segments with quick connectors. The whole kit can fit into the trunk of a car and can run off a Raspberry Pi and pixel controller.

The first day I got it up and running, I was almost trampled over by the office dog.

Our original launch date was sometime in the spring, but a show came up at the Wood Street Galleries, and they asked if I could slap it together in time. It’s still a work in progress, but one that is fun to play with. As with theater, it looks even better in person and is hard to capture on camera. Take a peek; we hope to do more with it soon!

Videos, Blog, and LED installation Copyright © ProjectileObjects 2018

Best way to capture and share VJ content?

What’s the best way to capture VJing live?

I put this article together to share some of the best ways I’ve found over the years to capture an HDMI signal from a computer. It also covers ways to share visuals from one computer to another so that two or more VJs can work together.

This article is broken up into two sections: standalone capture boxes, and UVC devices that require a computer. If you’re only interested in sharing VJ content from one computer to the next, skip to UVC devices.

If you’re familiar with HDCP, screen resolutions, and Syphon/Spout, feel free to skip ahead. I won’t go into the details of color encoding/spacing or codec compression, but I’ve included links at the bottom of this article.

HDCP and the pain it can cause  

Most computers, Macs notoriously, will sometimes require an HDCP handshake before they send a video signal. This copy-protection protocol is implemented to keep you from bootlegging content. Unfortunately, even if you’re sending out a VJ signal you own, there is no ‘legal’ way to disable HDCP within the software or hardware of your computer; but you can bypass it…

A VGA cable is the most cost-effective way to dodge HDCP, but a VGA capture device can be harder and more expensive to find. :/

An HDMI splitter is another possible way, but not all are the same, and later you may find that some playback devices won’t work at all if they do not detect HDCP. I don’t know too many VJs using an Apple TV or Amazon Fire TV Stick, but it can happen. For those, there are better ways than splitting or converting the signal.

My first foray into the world of capture devices took me on a collision course with HDCP. Nothing I tried would work, and the only thing that did was expensive and problematic. In 2012, I landed on a Blackmagic HyperDeck Shuttle SSD recorder. It had a battery, captured HDMI or SDI, and didn’t have a problem with my MacBook Pro’s HDCP connection. I hit record, and… the uncompressed files were MASSIVE, the SSD drives were expensive, and only a few SSD chipsets were compatible. VJing on my laptop was enough to cook an egg, so built-in software capture was out of the question; I needed all the GPU and CPU power I could get.

 

What was I looking for?

HDMI in & out, HD recording at 1080p30 or higher, and low to no latency. I tested dozens of “streaming boxes”/capture devices that would add as much as a 2-second delay. For me, VJing is about timing and synesthesia, and a 2-second delay is not going to work…

Standalone Capture devices:

vj recorder box capture device VJing Projectileobjects

All the devices from Cloner Alliance work and spoof HDCP. If you do a quick Google search for HDMI USB capture, you will find boxes that look almost identical to Cloner Alliance’s, and guess what, they are. I’ve even tested Cloner Alliance’s firmware updates on the Evolve look-alike box that I bought off eBay. (Update firmware at your own risk; I cannot promise that all internal hardware will be compatible.) I will say: avoid their Flint LX box at all costs. It has a 720p image processor inside that adds a sharpening filter to cook out a 1080p signal. It is not real 1080p, and it even adds a terrible amount of latency.

Most of these boxes are simple: insert a thumb drive (or TF card, i.e. microSD), plug in your HDMI source, and connect it to a projector or monitor. Press a single button to start and stop recording. It takes a few seconds to finalize the .mp4 file, but that’s it. You now have an H.264 video ready for sharing, streaming, or playback.* (H.264 .mp4 is not an optimized codec for live visuals; look into HAP if your VJ software supports it, or DXV for Resolume.)

The AVerMedia portable gamer capture will not work with HDCP signals. (Sorry, Kevin, for not warning you sooner.) Thanks to a comment from That Fuzzy Bastard after posting this article: he uses Thunderbolt to DVI, then puts a Cable Unlimited DVI-to-HDMI (AUD-2362) box between the computer and the AVerMedia (see comments).

 

UVC and Thunderbolt Devices:

magewell capture usb blackmagic thunderbolt capture vj mutliple vjing capture device

These devices require a computer to capture or “share” the content from one VJ to the next. Most free up CPU power by handling the image processing on board. In turn, your computer treats the device like a USB 3.0 webcam and pulls in the signal with almost no latency. There are PCIe versions of these, but I VJ with laptops. A majority of UVC devices are designed for streaming or compression: some aim to produce an optimized video stream or compress video for smaller file sizes, and in doing so, the onboard processors take more time and add video latency (delay). In my tests, only three devices met my needs for low latency. The best by far was a Magewell USB Capture HDMI Plus, which can handle a 2K signal. Magewell seems to offer the highest quality among UVC capture devices, but at a more expensive price point ($359 at the time of this post). The best overall for price and performance that I found is the iEi HDB-301R-R10 ($149.99); despite the iEi specifications saying that it only works with 10.13+, it worked perfectly on OS X 10.12 and Windows 10 without additional drivers. The cheapest is a generic capture card that someone else has already reviewed in greater detail.

I’m still waiting to test the Cloner Alliance Flint LXT, MOKOSE USB 3.0 HDMI, and MOKOSE HDMI.

The Elgato Game Capture HD60 S was a terrible disappointment. It has a small form factor, a competitive price, and low latency, but it doesn’t work with anything outside of Elgato’s proprietary software. Likewise, these didn’t work: https://www.amazon.com/gp/product/B00PC5HUA6/ nor https://www.amazon.com/gp/product/B06XWL7SZD/

🙂 Thunderbolt:

I have a love-hate relationship with Thunderbolt (most of it hate), but the low price point of the Blackmagic UltraStudio Mini Recorder was pretty hard to beat at $149 (plus the price of a Thunderbolt cable), and it was able to transport an HD video signal from one computer to the next with low latency. It requires Thunderbolt 1 or 2, which can be limiting for PC users, and it is designed for broadcast and TV standard resolutions, not computer resolutions! The max resolution the BMD Mini Recorder supports is 1920x1080i60. You may have seen something like 1080p, 1280x720p60, or 59.94i; the “i” means interlaced, and if there’s a “p,” it’s progressive.

Think of the “p” as a full picture, so p60 would be 60 pictures per second, and p30 = 30. TVs are changing, but they have long run at 29.97 fps, or 59.94 interlaced fields per second. Here’s a helpful photo.

interlaced vs progressive vj capture devices
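Those odd 29.97/59.94 numbers aren’t rounding noise: NTSC timing runs at exactly 30000/1001 frames (or 60000/1001 interlaced fields) per second. A quick check with Python’s Fraction type makes the relationship concrete:

```python
from fractions import Fraction

NTSC_FRAME_RATE = Fraction(30000, 1001)   # ~29.97 progressive frames/s
NTSC_FIELD_RATE = Fraction(60000, 1001)   # ~59.94 interlaced fields/s

# Two interlaced fields (odd + even scan lines) make one full frame.
assert NTSC_FIELD_RATE == 2 * NTSC_FRAME_RATE

frame_duration_ms = float(1000 / NTSC_FRAME_RATE)  # ~33.37 ms per frame
```

So 1080i60 really means 60000/1001 fields per second, each carrying half the scan lines of a frame.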

If there’s latency or dropped frames, the interlaced lines can become visible over the top of your footage. You can remove them with a post-process, but not so easily during a live performance. (One of those things you as a VJ would notice that the rest of the audience might not.)

To avoid this, you’ll have to send a progressive signal into the Mini Recorder. It supports 720p50, 720p59.94, 720p60, 1080p23.98, 1080p24, 1080p25, 1080p29.97, and 1080p30. I use SwitchResX to force Mac outputs to my desired resolution. Another downside is that the Mini Recorder does not support 16:10 aspect ratios; this forces me to crop or stretch the resolution when I’m working with a WUXGA projector. Simply put, if you’re on a Mac, the Blackmagic Mini Recorder is really good, but far from perfect. (It supports some Windows machines with Thunderbolt.) As of posting this article, I’ve sold my Mini Recorder and purchased a Magewell. But something new or better will come out soon, or maybe it’s already here (NDI?).

Bonus section: Syphon / Spout, TCP, NDI, etc.

Syphon (Mac) and Spout (PC) are graphics-sharing pipelines that can share video across applications on the same computer, and with TCPSyphon or NDI you can send that signal over a network (Gigabit recommended). These are fast and usually do not require additional hardware (unless you’re on a MacBook Pro that needs thirty USB-C dongles for you to VJ). NDI is excellent once you’ve got your setup working. I like the UVC USB 3.0 devices because they are compact and bus powered, but if you can haul an Ethernet connection to your next event, then maybe NDI is your new best friend.

NDI to Syphon (Mac)

https://docs.vidvox.net/freebies_ndi_syphon.html

NDI to Spout (Windows)

http://spout.zeal.co/download-spout-to-ndi/

For Resolume users:

https://resolume.com/support/en/syphonspout

 

WHAT ABOUT MULTIPLE VJs???

I am the proud owner of a Roland V-1HD 4-input HDMI mixer. It has BPM sync, MIDI in & out, and it’s compact.

 

I hope this article helped. If you found any better devices or found this to be useful, please leave a comment or message. Thanks -Cornelius

 

Additional Resources:

Display Resolutions:

https://en.wikipedia.org/wiki/Display_resolution#/media/File:Vector_Video_Standards8.svg

 

Color spacing:

https://en.wikipedia.org/wiki/Chroma_subsampling

http://blogs.adobe.com/VideoRoad/2010/06/color_subsampling_or_what_is_4.html

http://www.clearone.com/blog/what-is-444-color-and-why-should-i-care/

https://en.wikipedia.org/wiki/Color_space

 

Encoding:

https://en.wikipedia.org/wiki/YUV

https://en.wikipedia.org/wiki/RGB_color_model

https://en.wikipedia.org/wiki/YCbCr

https://en.wikipedia.org/wiki/YPbPr

http://paulbourke.net/dataformats/nv12/

 

Codec:

https://en.wikipedia.org/wiki/H.264/MPEG-4_AVC

https://en.wikipedia.org/wiki/Motion_JPEG

 

Catcalling is Harassment Stickers #catcallingisharassment


Project update: Due to the paper stickers’ lack of weatherproofing (indoor only), I have pivoted to having a more “permanent” sticker made. Introducing the #CatcallingIsHarassment eggshell stickers. They adhere to smooth, clean surfaces and become difficult to remove after about 5 minutes. They are now stationed at a few local coffee shops in the Pittsburgh area. Thus far we’ve made 12,500 stickers.

You can download the design in the previous post (below) to make your own.


—– Previous post.

This isn’t in the realm of projection mapping, but I’m sure some night soon this will be brightly displayed on the side of a building. For now, I’m using this blog as a way to share the idea.


#CatcallingIsHarassment stickers

It only took $163.14 to make 3,000 4″x4″ stickers, 86% of which “disappeared” in less than a day. I have since updated the design to incorporate the hashtag #CatCallingIsHarassment to help others share the idea, and uploaded the files for anyone to download, share, or print, e.g. as signs & stickers.

Catcalling Is Harassment Files

Catcalling is Harassment PDF only

Initially, someone printed 500 vinyl stickers for free. They were amazing quality, but limited, more expensive, easier to remove (in one piece), and more detrimental to the environment. For paper stickers, I found this company had good prices and turnaround time. I’m going to see what else is out there, but for now: https://www.printplace.com/products/promotional-and-event-stickers

Hopefully you can take it from here, spread the word, et cetera. Let me know if you find a better printer.

#catcallingisharassment #catcalling


Update and EspressoAMano open projection mapping project.

I’ve been contributing to projection-mapping.org more frequently than my own blog, and it has been some time since I last posted an update. For me, it’s a good problem to have: I’ve been busy…

Over the last 2 years, I’ve been piling up a collection of unedited videos from a slew of projects that I’ve had the honor to be a part of. What were once planned as detailed written accounts for educational purposes may well end up becoming a flashy promo reel propelled by high-energy EDM pop music. <- Or not.

That’s what I have seen everywhere else, and while I understand it, it’s not in my immediate interest to create one, or to make bids on larger corporate events. While I may consult on such events from time to time, believe it or not, I’m in this for the fun of it.

A few days ago I finished an update for a local automated projection mapping installation. The goal: take a cool local coffee shop and light up the upper windows at night. The ulterior motive? To see what others can do with the visual space when given the opportunity.


Share the link, download the template, submit your work, and it will be shown publicly. That’s it. https://www.evernote.com/l/Ag2xwxA4QRtJE4z8E8lZ7x8jbwDmkIZXrvM

 

Only time will tell what’s going to happen next with this installation, but the opportunity is something I am excited about. Check back in a few weeks or so, as I will have a detailed update on how I made this installation possible and the simple “built-in” means of projection automation.

 

The location:

Espresso A Mano (https://espressoamano.com/)

3623 Butler St, Pittsburgh, PA 15201

Current Projector Operating hours (subject to change):

Mon-Thur:

9PM-11PM

Fri & Sat:

9PM-2AM

Sunday:

6PM-9PM

 

 

OMG 5 Tunnel Party

OMG 5 Tunnel Party Uploading to Youtube vs Facebook

It’s been a while since I’ve had a moment to blog, and even longer since the first OMG parties, including our infamous U-Haul Dance Party (<-Video / Blog Post <-). So we did it again, but this time we had a tunnel dance party. <- Why not? The video was quickly put together as a recap and shared on YouTube & Facebook.

Interestingly enough, Facebook favors its own video content so heavily that within 24 hours of posting the Tunnel Party video we reached more than 5,000 people, compared to our YouTube page, which only reached 5. 5,000 is a 99,900% increase over 5. So why even upload to YouTube? Archival, SEO, and quality. While Facebook is the social sharing hub of our time, videos disappear just as fast as they arrive. They get shared with friends, and friends of friends, but they don’t rank highly in search engine results. Lastly, to optimize streaming across all devices, Facebook really dumbs down the quality of any video uploaded to it.

That being said, I hope you enjoy our tunnel party video as much as we enjoyed our tunnel party! Thank you to everyone who came out.
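For the curious, that percentage is easy to check:

```shell
# percent increase from YouTube's reach (5) to Facebook's (5000)
old=5
new=5000
echo "$(( (new - old) * 100 / old ))% increase"
# prints: 99900% increase
```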

Raspberry Pi Zero projection mapping


Update: If the content is pre-mapped (meaning, made in After Effects, Photoshop, MS Paint, etc.) and rendered out at the projector’s native resolution, the Pi Zero works great as a standalone seamless video looper by simply embedding this code into the rc.local file. (I stored a video file on a USB flash drive and named it “Video.mp4”. You can also put your video file on the microSD card that Raspbian is installed on.)

omxplayer -b -o hdmi --no-osd --loop /mnt/usb/Video.mp4
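For reference, here’s a sketch of what the rc.local additions might look like; the device name /dev/sda1 and the /mnt/usb mount point are assumptions you’d adjust for your own setup:

```shell
# Sketch of /etc/rc.local additions (place above the final "exit 0").
# Assumes the USB stick enumerates as /dev/sda1; adjust for your hardware.
mkdir -p /mnt/usb
mount /dev/sda1 /mnt/usb
# -b blanks the background, -o hdmi routes audio over HDMI,
# and the trailing & backgrounds the player so boot can finish.
omxplayer -b -o hdmi --no-osd --loop /mnt/usb/Video.mp4 &
```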

Old —–

Built off of the 1st gen. versions of the Raspberry Pi, the Pi Zero is a $5 computer that seems fit for simple projection mapping and playback. Thus far its limitations have simple fixes, but the lack of Ethernet and additional USB ports is a hoop that has to be jumped through. (My current setup is a micro USB to 4-port USB 2.0 hub with a nano WiFi dongle & a Logitech K400.)