Imagine if Buildings Could Talk – Wrapup

You are going to want to get a cup or glass of a favorite beverage for this one – it’s long.


Project Timeline

  • 2015: Summer – Gayle Seymour contacts me about doing a projection mapping event on the facade of LRCH
  • 2015: Contact Jim Lockhart and Jonathan Richter (J & J) about working on it with me
  • 2015: NEA Grant submission
  • 2015: Fall – Sabbatical Application for spring 2017
  • 2016: Spring – Get news of NEA grant (awarded, but not fully funded). Awarded sabbatical leave
  • 2016: Summer
    • Meet with Blake Tyson about the piece
    • Set up Google Drive (shared docs with Gayle, Jennifer, and Blake), Dropbox (shared files with Jim and Jonathan), and Basecamp (communication with J & J mostly, but with Gayle, Jennifer, and Blake as needed)
    • Contacted projection companies about possibly doing the project
    • Late summer – MooTV got on-board as projection company with Travis Walker as project manager and Tim Monnig as systems engineer
  • 2016: Fall
    • Blake Tyson composes and records music
    • Developed themes for imagery
    • Started modeling the facade
    • Occasional meetings with National Park Rangers and LRCH Principal, Nancy Rousseau
    • Nashville meeting with J & J
    • Not much more besides answering questions and trying to make it through the semester
  • 2017: Spring
    • Started full-time production – finished facade and statues
    • January Nashville meeting with J & J
    • Demos and marketing images, several things to show what projection mapping is, but not imagery that would go into the final piece
    • Developed “School Life” section
    • Two visits to the Arkansas State Archives
    • Multiple meetings with Nancy Rousseau, Rangers, and working with potential vendors for Sound and Generator
    • Request for permission to use Will Counts photos (early April)
  • 2017: Summer
    • 3D production – finish School Life; develop Desegregation Crisis section with several re-starts throughout summer
    • Finalize vendors for sound and generator and work out scaffolding, ultimately finding a scaffolding vendor
    • Get permission for Will Counts photos and Raymond Preddy photos; Finish Desegregation Crisis section (August)
    • Ongoing communication with J & J regarding their sections
    • PROMISE youth camp workshop
  • 2017: September
    • All imagery assembled from the three of us artists
    • September 15th – Meet at MooTV in Nashville (J&J, Travis and Tim) to walk through playback and design the lighting looks
    • Window, door, and sconce drapes made
    • September 21st
      • Scaffold built; projectors, sound, and lights loaded in
      • Windows covered from interior
      • Doors and sconces covered on exterior
      • Projector alignment
    • September 22nd – image mapping and alignment
    • September 23-24 – public shows and load-out
    • September 25th – 60th Anniversary event



Original Blueprints at the Visitor’s Center

The Content

The sections of the animation:

  • Opening – Construction and Statues (Jonathan)
  • School Life – academics and athletics over the history of the school (Scott)
  • Desegregation Crisis and Lost Year (Scott)
  • Close – the spirit of the students in school now (Jim)

Projector Point of View, students installing exterior drapery

Why did it take so long?

There are two main reasons it took months to get the work finished. The first is that 3D animation takes a while to do, especially if there is a lot of modeling detail. I spent a few weeks modeling the statues and, though the rest of the facade was relatively simple, I spent quite a bit of time on the model so it would match the photographs as closely as possible. Once I got into the School Life section, I found myself mired in the 3D details too.


Finished 3D Model of the LRCH Entrance

The second is that turning a general theme into imagery can take time. The School Life section was the first one I designed because I thought I had a strong sense of what I wanted it to be. As I pecked away at it by building or procuring models of books and sports equipment and then creating materials and lighting, I was not only burning time, but also slowly developing an idea of what the section should look like. On top of the 3D animation, I also spent about 10 hours at the Arkansas State Archives over two trips going through old yearbooks and newspapers. To finish researching the photographs I needed, I visited the LRCH library with help from Stella Cameron and went through some materials Gayle found at an estate auction. A lot of time went into bringing the 3D and 2D elements together, plus over two weeks of rendering the final animation.

The Desegregation section also took a great deal of time to go from a theme to actual imagery. I wasn’t sure I would have permission to use the photographs made by Will Counts until it finally came through in mid-July (I had originally asked in early April). Through the summer I worked on several different ideas to illustrate the events of 1957-59, but was never happy with any of them. Either the ideas were too great in scope or they just didn’t look as good as I wanted when I tried them out.

When I got permission to use the photos, I decided to celebrate the photos themselves with an overall look reminiscent of a project I did several years ago, though this version ended up being much better. Also, with a strong sense of what I wanted to do with the photos, I was able to get the work done quickly. The actual production time on the Desegregation Crisis was significantly shorter than on the School Life section, but it took far longer to reach the point of producing final imagery. The slowest part was creating the images of the Little Rock Nine entering the school. Those photos exist, but they are blurry and taken from too far away, so I re-created them in a slightly illustrative style.

One of the best aspects of doing this project was that there was no client. Jim, Jonathan, and I chose to take our time to think about the piece and let ideas evolve. There was a lot of incubation time as well, which let me try an idea out and then let it sit for a while to see if it was right. The three of us normally have to think on our feet and get work done quickly for clients, but this project afforded us the time to consider what we were doing and be satisfied and proud of our choices.

Similarly, when there is no client and there is time, I am able to let my INTP-ness express itself. I prefer to think through a problem and try different approaches, and I am willing to work through solutions in my head and let them go if they aren’t working. For personal work this too often means I never finish a project, but since this particular project had a real deadline, I was able to mix my inclination for mental play with actually getting things done.


Saturday night audience


For me, show production started on September 15th, when I went to Nashville to program the lights and check playback. Jim, Jonathan, Tim, Travis, and I met for the first time in the same space to talk through playback of imagery and sound and to see what we were capable of doing with lights. I had sketched out lighting ideas for each section and we started with those, but it turned out that Travis is a lighting designer too and was doing the programming, so we worked together to create the looks for each section. This process took about eight hours, but it paid off: on-site at the school we only had to tweak some timing rather than do any more design work.

September 21st was load-in day for projection, scaffolding, sound, the electrical generator, and drapery. Considering all of those elements had to come together in the same late afternoon, it went remarkably smoothly. I created an itinerary so each of the vendors could make sure things got set up in a certain order. Rock City Staging installed an 8’ x 16’ x 8’ platform with a roof and side covers. A/V Arkansas installed a sound system and supervised placing the generator, provided by RIGGS, and running the power cables. Our primary directive from the principal and the rangers was to keep all equipment out of sight so anyone taking a picture of the school would not have AV equipment in the shot. A/V Arkansas accommodated this by placing the speakers behind trees to the sides of the main entrance. The UCA Physical Plant assisted Travis and Tim in lifting the 240 lb. projectors onto the 8’ platform, and also set up barricades around the projection platform.


Scott (in too baggy clothes), Travis, and Tim

Shauna led a team of Film students to install drapery on the interior of the arched windows, and I got some help from Jim and Matt Rogers, a Film student, carrying lighting fixtures up to the fourth-floor roof and to the sides of the facade.


Lights on the roof over the entrance

That evening we were able to power up the projectors and lighting equipment as it was getting dark enough to see them. The school was having a parents’ open house, so we had to wait until about 7:45 PM to install the outside drapery for the first time.


Students installing drapery, right – Jonathan and Jim

For the rest of the night, until about 3:00 AM, Tim worked on aligning the projectors. The process is slow since there are multiple projectors and the alignment software on them is sluggish (click to move a pixel, then wait several seconds to see the change…).


Projector alignment

September 22nd was about testing the playback, lighting, and sound systems first, then mapping/image alignment. There was a home football game that night against rival North Little Rock, so we could not turn off the building lights. Luckily, the projectors were bright enough to be seen over them. After running through the show a few times we let the sound guys from A/V Arkansas go and turned off the lights. The rest of the night was about mapping the animation to the building.

We had some technical issues with a model I provided for mapping, but they went undiscovered until late. Tim worked through the night to get the mapping software to do the best it could with what it was given. Overall, the mapping looked good. Unfortunately, there was offsetting towards the bottom of the image that couldn’t be fixed, but it wasn’t that noticeable.


Image mapping – Tim with binoculars checking his work

September 23rd and 24th were the shows, starting at 7:30 PM and running every 15 minutes until 9:30 PM. The final animation with credits was a little over nine minutes. At Shauna’s suggestion, I created a countdown animation so the audience would know when the show would run again. The lighting console ran the show by sending a start command to the video/audio system and running the light cues.
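
Running every 15 minutes from 7:30 to 9:30 works out to nine runs a night. A quick sketch of that schedule (a back-of-the-envelope check, not anything we actually ran on the console):

```python
from datetime import datetime, timedelta

def show_schedule(first="19:30", last="21:30", interval_min=15):
    """List the start times for a show that repeats on a fixed interval."""
    fmt = "%H:%M"
    t = datetime.strptime(first, fmt)
    end = datetime.strptime(last, fmt)
    times = []
    while t <= end:
        times.append(t.strftime(fmt))
        t += timedelta(minutes=interval_min)
    return times

print(show_schedule())  # 9 runs: 19:30, 19:45, ... 21:30
```

With a nine-minute show, that leaves roughly six minutes of countdown between runs.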

The two nights started around 6:45 PM with installing the exterior drapery. Then we would wait for 7:30. Saturday night, a jazz concert preceding the show started late and ended late, so we didn’t start until nearly 8:00 PM. Sunday night there was an event at the Commemorative Garden across the street before the show, which ran on time and was designed to lead the participants and audience over to the school building to see the animation. Both nights were well attended, but Sunday night, to my surprise, had the bigger audience. Each night several people stayed the whole time and watched the show repeat, which was strange and cool.

Sunday night we took down (struck) most of the equipment. The students, Chris Churchill, Steve Stanley, and Jim helped take down the exterior drapery and the lights. UCA Physical Plant came back to help take down the projectors for MooTV. A/V Arkansas also struck all of their equipment. Monday the platform and generator were removed from the grounds.

Mr. John Roberts, facilities engineer, was instrumental in getting us into the building, out on the roof, and getting the lights turned off. I can’t thank him enough for his work and the extra time he put in to help us.


Lighting check

Monday, September 25th, was the official 60th anniversary event, and I was pleased to be invited. It was amazing to see and hear the eight surviving members of the Little Rock Nine, and to experience President Bill Clinton’s keynote address.


Bill Clinton with the surviving members of the Little Rock Nine


I was overwhelmed by the audience response to the work. Several of my friends and colleagues saw the show and let me know how much they liked it. I was also approached by people from the community and LRCH alums who truly appreciated it and thanked us for doing it. Each person seemed to have a favorite moment, and thankfully those moments came from every section: probably most of the comments were about the moving statues in the opening; the tiger in the School Life section was mentioned several times; the Nine entering the building was another highlight; and the rainbow spheres were also a big hit. I was especially pleased with the emotional response. After working on it for so long it was hard to know whether it was really going to work, and I think it did very well. Nancy Rousseau had seen the School Life and Desegregation sections on a computer weeks before the show; she really liked the whole show, and I believe she truly appreciated the effort that went into lighting up her school.


Opening – Statues introduction

What struck me the most was how well Blake’s music and the imagery came together. I’ll admit that Jim and Jonathan worked harder at synching the imagery to the music than I did and it paid off. The place I worked the most on blending the imagery and the music was the end of the desegregation section where the music brightens while we see the capitol statues rush past. The music gave the imagery an emotional depth that was very satisfying.

The lighting surrounding the projection area was awesome. It was part of my original vision and it worked out so well. The lights were bright and colorful and not only expanded the projection area, but also removed the rectangle by fading it upwards and to the sides. The scale of the piece increased dramatically by incorporating the lighting.


School Life

I knew we had something special based on Shauna’s response. She hadn’t seen it prior to the first show Saturday night and she was blown away. She was reluctant to support the project at first (in 2015) because she knew it would be big and that I would be working on it for a long time, but after seeing it the first time she was ready for me to start doing the next one, wherever that may be – even if it was back to a stack of boxes.

Projection mapping events are special because they are unique to their locations. I’m pleased to show the video of the work, but it does not do it justice. The vibrant color, the bright light and images, and the overall scale of the piece do not translate to video. It’s similar to watching a play or concert on TV – it’s just not the same.

The project was highlighted by a press conference and a kick-off event preceding the show on the UCA campus, an interview in the Democrat-Gazette, an interview on KTHV 11, an interview for The Echo, and an interview on Spotlight. It was also covered by UCA media before and after the event and was featured in the UCA President’s Update email newsletter.


Desegregation Crisis section – Little Rock Nine entering the building

Lessons Learned

I refuse to nitpick the piece. I’m proud of our work and I know how it could have been better, but overall I am more satisfied with this work than practically anything I’ve done in the past. Having said that, there are a few things that I’d like to document as far as lessons learned.

  • We should have done more cool stuff. The moving statues were a HUGE hit and, though we originally planned to do a lot more with them, we only animated them once. It would have been nice to move them at least one more time. The rainbow spheres were also a big hit. Projection mapping events commonly incorporate playful animation of the architecture, and while we did some, we could have done more. It was mostly my fault: I was so worried about respecting the events and the school that I forgot to have fun. I also found through my mapping simulations that the building was hard to transform. The projection area is already quite three-dimensional, so I found it difficult to mess with it much. It was a good lesson to learn from the audience responses, and I will definitely incorporate more transformational animation in future projects.
  • Projection mapping alignment took quite a long time. I should have worked with Tim more in the weeks prior to the show to go through the details of the mapping process. I was providing a 3D model, but not really getting into the rest of the process. I assumed that at the very least we would do a flat projection on the building since the rendered animation was from the point-of-view of the projectors, but as I saw the mapping process I realized that I was being naive. Tim and I are currently working through the process to see where we can make it better in the future and why we had some model compatibility and scaling issues.
  • The lighting was so important to the look of the final piece. It was a very effective way of expanding the scale of the projection and softening the rectangular shape. I definitely see using that technique again.
  • Get the sprinklers turned off. Though Jim would probably not agree, I’m glad we were there when the sprinklers went on at 1:00 AM or so. I ended up covering a couple of them with weighted buckets.

Close – Lines of Light

Hours and Emails

I logged 623 hours on the project from early January 2017 until a few days after the event. I created a Google Form that I kept open in a browser window. It had a line for me to say what I did that day and a choice of a number from 1-10 for the hours spent. My most consistent hours were from late January through May, and then again from mid-July through August. Though I did work in June, it was spotty due to some outside projects and taking some time off. The hours include graphics work as well as time spent on emails, documentation, meetings, and other tasks related to the project.
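
One nice side effect of logging in a form is that the response export is easy to slice. A minimal sketch of summing an exported CSV by month – the column names (`Timestamp`, `Hours`) are my guesses at what the export would contain, not the actual file:

```python
import csv
from collections import defaultdict

def hours_by_month(csv_path):
    """Sum logged hours per month from a Google Form response export.

    Assumes a 'Timestamp' column like '1/23/2017 21:04:10' and an
    'Hours' column -- adjust the names to match the real form fields.
    """
    totals = defaultdict(float)
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            month, _day, year = row["Timestamp"].split(" ")[0].split("/")
            totals[f"{year}-{int(month):02d}"] += float(row["Hours"])
    return dict(totals)
```

A pivot table in Sheets does the same thing, but the script version made it easy to re-check the 623-hour total.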

There are 496 emails in my Central High folder.

On Basecamp we had 20 discussions with 152 total messages.

There are 39,217 files on my computer related to the project. The number does not account for duplicate files, such as photos on a local hard drive with a copy on Dropbox. Nearly two-thirds of the files are rendered frames (30 frames per second of animation).
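
As a rough check on what that frame count means (using the two-thirds figure and 30 fps from above):

```python
total_files = 39_217
frame_share = 2 / 3   # roughly two-thirds of the files were rendered frames
fps = 30

frames = total_files * frame_share
minutes = frames / fps / 60
print(round(frames), round(minutes, 1))  # about 26145 frames, about 14.5 minutes
```

That is well over the nine minutes of the final show, which squares with the re-starts and test renders along the way.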

Sabbatical Leave

I was awarded a sabbatical leave for the spring semester of 2017 to work on the project. The leave was needed for the project, but also something I NEEDED to do for myself.

I needed a break from teaching and service activities. I’ve been teaching each fall and spring semester at a university for the past 17 years. That may be grand for some professors, but I have such a love/hate relationship with academia and large organizations that I need breaks. I have also been going through a transition in my career both as an educator and working professional and needed some time to take stock of where I’ve been and where I would like to be in the future. A big part of applying for a sabbatical leave of absence was to give myself some time to recharge.

Similarly, I needed some time to focus and do some deep thinking about a creative problem. Teaching classes, helping colleagues on their creative projects, and doing short professional projects are fine, but they do not give me the opportunity to do my own creative work. The projection mapping project made me do the things I like to do, such as research, create 3D animation (modeling, animation, lighting, shading, rendering, compositing), and do some creative problem solving regardless of the content. In this case, that meant working out how to communicate the history of a school, school spirit, racism and the desegregation crisis, diversity, and education. I don’t believe I could have done the work while also carrying a 4-4 course load and the service expectations at UCA. I’ve done several other projects during a regular semester that were highly compromised due to the time and mental effort taken away by the needs of the job of professor.

Geek Stuff

Just a list of software used on the project:
3D: Blender (Scott), Maya (Jonathan), Cinema4D (Jim)
2D: Affinity Photo and Designer, Photoshop
Compositing: After Effects, Fusion, Premiere Pro
Video Editing: Premiere Pro, DaVinci Resolve
Projection Mapping: Pandora’s Box

We compiled the show and synched sound in Premiere Pro. The project file was on Dropbox and we would text each other when we had the file open so someone else wouldn’t open it and cause a problem overwriting it.
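
Texting worked, but a lock-file convention next to the shared project could do the same job automatically. This is a sketch, not what we did – the file name is hypothetical, and since Dropbox sync isn’t instant, the lock only protects you once it has propagated, so texting remains a sensible backup:

```python
import getpass
import os

LOCK = "CentralHigh.prproj.lock"  # hypothetical lock file beside the shared project

def acquire_lock():
    """Claim the shared project file, or report who already has it."""
    try:
        # O_EXCL makes creation atomic: it fails if the lock already exists
        fd = os.open(LOCK, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
        os.write(fd, getpass.getuser().encode())
        os.close(fd)
        return True
    except FileExistsError:
        with open(LOCK) as f:
            print(f"Project is open by {f.read()} -- wait for the lock to clear")
        return False

def release_lock():
    """Give the project back by deleting the lock file."""
    os.remove(LOCK)
```

Each editor would call `acquire_lock()` before opening the Premiere project and `release_lock()` after closing it.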


Close – Rainbow spheres

Thank You

Thanks so much to everyone who helped and supported the project. Blake, Jim, Jonathan, and I could not have pulled it off without these people.

First, Gayle Seymour, Associate Dean of the College of Fine Arts and Communication at the University of Central Arkansas, who was my partner on this project. This whole thing is her fault. She recruited Blake and me back in 2015. Throughout the process she sheltered me from the politics surrounding the 60th events and she dealt with the things I needed to worry about but couldn’t do anything about, for instance getting 24-hour security for the equipment on-site and dealing with contracts and funding sources. There are so many more things that she did, including working with Jennifer Deering, Grant Writer in UCA’s Sponsored Programs department, to write grants and organize many more events besides the projection mapping event.

I also want to say that Tim Monnig and Travis Walker at MooTV were instrumental in making this happen. We were in contact with each other for 12 months to make sure everything was going to work properly. They are great to work with, and I hope we can do it again soon. They mentioned projecting on the Capitol Building, which sounded cool :)

The following is a list of those who contributed to the project:

  • Produced by Gayle Seymour, Jennifer Deering
  • Animations by W. Scott Meador, Jim Lockhart, Jonathan Richter
  • Projection by MooTV – Nashville, TN. Travis Walker, Tim Monnig
  • Musical score – The Surface of the Sky by Blake Tyson
    • Performed by UCA Percussion Ensemble – Carter Harlan, Victoria Kelsey, Jarrod Light, Bradlee Martin, Scott Strickland, Stephen Timperley
  • Sound by A/V Arkansas
  • Projection Platform by Rock City Staging
  • Lighting Equipment by UCA Theatre, Greg Blakey
  • UCA Film Student Crew – Rebecca Koehler, Melissa Foster, Jonhatan Nevarez Arias, Matt Rogers, Zack Stone, Takuma Suzuki, Dawn Webb
  • Projection Drapery by UCA Theatre, Shauna C. Meador, Sidney Kelly, Donna Dahlem, Hannah Pair, Autumn Toler
  • UCA Physical Plant – Dustin Strom, Jeremy Davis, Dale Gilkey, David Mathews, Tom Melrose, Skipper Pennington, Joe Richards, Joey Williams
  • National History Sites – Robin White, Tarona Armstrong, David Kilton, Marchelle Williams, Jodi Morris, Chelsea Mott, Toni Phinisey-Webber
  • Special thanks to:
    • Nancy Rousseau, Jane Brown, Stella Cameron, Scott Hairston and the LRCH Student Council, Mr. John Roberts
    • Kristy Carter
    • Carri George and Arkansas PROMISE staff
    • University Relations and Creative Services
  • Student images used in closing section
    • Ashlyn Sorrows – Stand Together
    • Shelby Curry – Human
    • Madison Bell – I am Human
    • Erbie Jennings III – Laying the Foundation
    • Charis Lancaster – We Come In Pieces
    • Mae Roach – Released from Chains
    • Joah Gomez – the world is in your hands
  • Funding provided by
    • National Endowment for the Arts
    • National Park Service
    • Mid-America Arts Alliance
    • Arkansas Arts Council
    • Department of Arkansas Heritage
    • City of Little Rock
    • University of Central Arkansas
  • Will Counts photographs provided by Vivian Counts, Bradley Cook, and The Arkansas Arts Center
  • Raymond Preddy’s photographs provided by UA Little Rock Center for Arkansas History and Culture

LRCH Teaser

I needed to get a video to local media to tease the projection mapping project and show something at a press conference on Monday, August 28th. I don’t really want to show any of it until it premieres on September 23rd, for a couple of reasons. First, the piece is a little over eight minutes long, so showing practically anything gives away a lot, IMO. Second, part of the power of projection mapping is the illusion, often surprise, which can only be fully experienced at the event – on the building – live.

I know we need to build an audience for this, but I wish there were a better way than previewing the work. So, this video shows some of the production process and two short moments that are in the final piece. The rest are tests or things that might make it into the final piece, but not necessarily the way they look in this video.

Live motion capture performance

I’m still hard at work on the Central High projection mapping project. However, I wrote 90% of this post months ago and just let it sit around, and I really wanted to get it published – so here it is.

Back in April a student of mine, Anna Udochi, and I installed a live motion capture (mocap) artwork for her senior BA exhibit in the Baum Gallery on the UCA campus. The installation featured actor Jordan Boyett in a mocap suit performing a character created by Anna. Jordan would interact with the gallery audience as they found their way into the room dedicated to the installation. He could see and hear the people looking at a large TV in the gallery space. The TV showed Anna’s character, Class Gales, standing in a pastoral environment. Jordan interacted with his audience by speaking to them and making specific comments, such as mentioning what they were wearing, to let them know that he was a live character. He then had a short impromptu Q&A with the viewer, often about the biography of Class.

Gallery space

This project started in the fall semester of 2016, when I worked with Anna as her mentor for her Art department 4388 course. She wanted to dig into 3D animation, especially character design. After throwing different ideas around, I mentioned that I have a motion capture system and that, if she was interested, she could use it for her project. Additionally, I thought that doing a live performance of her character design would also let her/us learn a game engine, which would give her the best realtime rendering and work with the mocap system. To my surprise she was very interested, so we developed a timeline to make it happen.

For the rest of the fall semester she learned to model and rig a character appropriate for realtime motion capture and export to the Unreal game engine. The character work was done in Blender. She also used Nvidia’s cloth tools to set up the character’s veil and cape. At the end of the semester we got Jordan to come in, wear the suit, and test the whole pipeline. By early spring, Anna had created a third-person game so Class could run around the countryside using a gamepad. This gave her installation something to run when Jordan was not there driving the character.

Class Gales in Blender

Class in his pastoral environment as seen in front of the Unreal Engine editor

To make the installation happen we ended up throwing a lot of gear at it. Jordan wore the suit to articulate the character. We needed audio and video of the gallery space for Jordan to see and hear in the adjacent room. Then his voice needed to be heard in the gallery. Last, we needed to send the realtime game engine output to the TV in the gallery and be able to see it in the performance room. The setup was a mix of my own equipment and school equipment; I didn’t want to use any school equipment that would normally be checked out by Film students, since we would need the gear for most of April.

Here’s how we did it:

  • The mocap suit sends data to software created by the same company that makes the suit. That data is then broadcast and picked up by a plugin running in Unreal Engine. Unreal is then able to drive the character with the constantly incoming data.
  • In the gallery we used a flat screen TV that was already hung rather than a projector.
  • A little Canon VIXIA camera was mounted under the TV and a speaker was set on top of the screen. The camera sent an image and sound of people in the gallery to Jordan, who was in the storage/work room next door. The speaker carried the actor’s voice to the audience.
  • The computer was an old Windows PC I built several years ago for rendering that I let Anna use to do the Unreal stuff. It barely ran the mocap software and Unreal at the same time, but it somehow made it happen and never crashed.
  • The computer was cabled to a projector via VGA and out the projector’s pass-through to the TV. This worked, but VGA carries no audio the way HDMI does. The TV’s audio input paired with VGA/RGB was optical only, so I put in a separate speaker instead of using the TV’s.
  • Audio from Jordan: mic into an old signal processor we were getting rid of at school, which applied a pitch shift to his voice. From there into my old little Behringer mixer. I needed an amp since I was using a passive speaker, so I used an old Pro-Logic tuner I had…
  • The TV for Jordan was an old Dell monitor that had a million inputs, but the internal speaker wouldn’t work, so the audio coming from the camera went to a little Roland powered speaker.
  • AV wiring between the two rooms ran through three long BNC cables we had, with BNC-to-RCA adapters at each end.
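
The suit-to-Unreal link in the first bullet is, at the network level, a stream of UDP packets that the plugin decodes. A minimal listener illustrates the receiving half – the port is a placeholder and the packets are treated as opaque bytes, since the real joint-data decoding is the vendor plugin’s job:

```python
import socket

def listen_for_mocap(port=7004, max_packets=5):
    """Collect raw UDP packets -- the kind of stream the Unreal plugin consumes.

    The port is hypothetical; use whatever the mocap software broadcasts on.
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", port))             # listen on all interfaces
    sizes = []
    for _ in range(max_packets):
        data, _addr = sock.recvfrom(4096)
        sizes.append(len(data))       # a real client would decode joint data here
    sock.close()
    return sizes
```

Streaming over the network like this is what let the capture software and Unreal run as separate processes on the same overloaded PC.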

Jordan in the mocap room next to the gallery

What could have been better…

If the computer had been more powerful, I would have run Jordan’s mic into it and set up Max or Pure Data to synthesize his voice rather than using a dedicated piece of equipment. We would also have tracked down two long HDMI cables for the gallery screen and camera, which would have simplified our cabling. We spent almost no money, though: Anna bought a USB extension cable for the game controller, but that was it. Two other lessons learned were that the Canon camera did not have a wide enough lens, so Jordan could not see the whole gallery space; and I should have had him wear headphones to hear the audience rather than using a speaker – we fought a little with hearing the audience through his microphone.

Next steps

Realtime facial mocap on a budget. I think I’ve got it figured out, but did not have the time to implement it. Also, I see doing a lot with Unreal after I learn it more. It can render some beautiful stuff in realtime and it is a great platform for VR. The mocap suit is also being used for VR so your avatar can have a body when you look down.

Final thoughts

The project was very successful. It was seamless to the audience and we were able to see real responses from people of all ages as they were often startled (“who’s talking to me?”) into realizing that there was a virtual character interacting with them. The kids seemed to get into it with no problems, while several adults were freaked out or suspicious while talking to Class. I am also very proud of Anna’s work and Jordan’s ability to learn to improvise and drive the character.

The reason I did this project was that I originally purchased the mocap suit to do realtime character stuff, as opposed to using it to record human movement, but I hadn’t been doing very much. I also wanted to start learning Unreal Engine. Luckily Anna was up for it and I knew she could pull it off working mostly on her own for the character creation and world building in Unreal. Originally she just wanted to learn 3D and make a character she had designed, but she was really into the idea of making the character live. The downside was that we could only meet a few times during the spring semester since I was on sabbatical leave. I also wasn’t able to put as much time into helping with Unreal as I wanted to due to the Central High project needing most of my attention.

This was the first time since I came to UCA that I’ve shared a project and research with a student. Technically it was her work, but she was also contributing a lot to my own creative research interests, so rather than just being her mentor, I also had her as a research assistant. Until this project I had only ever been in a mentor role with students at UCA: they do their work and I do mine, completely separately. The project with Anna also finally picked up where I left off with the realtime performance mocap work I was doing with my grad students and faculty collaborators at Purdue back in 2001–2004.

Late in 2016, a similar project got a lot of buzz. From the interviews, one would think no one had ever thought of using a realtime mocap’d actor before them, but their work really was amazing in scope and bleeding edge. I think we could meet somewhere in between to make live mocap performances viable for lots of different events and art forms.

Thanks so much to Brian Young, Director of the Baum Gallery, for helping us with the installation!


LRCH Update – BTS

It’s been a while since I’ve updated. I’ve spent most of my time working on the Central High project, of course, but in late May and early June I took some time for outside projects and personal projects. One of the main reasons for the lack of updates is that I just don’t want to show anyone the work – it’s a surprise!

Progress Report

I had to write a progress report for the NEA, so some of the following is a little stiffer than usual, but it was easier to copy and tweak it than to re-write.

The animation has four parts: an opening “construction” section; “school life”; the desegregation crisis; and a “future”-themed close. They are not equal in length, since they follow sections of the music and carry different weight in the piece. The “school life” section is 98% complete, as are some transition pieces; together they make up about a third of the eight minutes of music. The opening section is well underway and will be the next section finished. Desegregation is also well underway, but it relies the most on outside resources, such as photos and films, which slows the process. The close is still in pre-production, as it takes its identity from the other sections and from the input the artists gather from the school, park, and community.


A shot from the School Life section. Perimeter lighting is not finalized, nor is the blue color in the windows.

To create the school life and close sections, the artists interviewed LRCH student council representatives and researched historic school newspapers and yearbooks found at the Arkansas State Archives as well as the LRCH library. The librarian and Gayle Seymour were also able to find resources from alumni. The librarian produced a bound version of the book released at the opening of the school in 1927, which has been helpful for the opening section. The National Park also had several resources for the opening.

The desegregation section is the most important and carries the highest audience expectations, so it is getting the most attention. The artists have run into several copyright issues over the use of iconic photographs from the crisis, so they are compiling as much original, legally usable media as possible and will design the rest of the section with a more artistic treatment than a documentary one, which fits the overall mission of the piece.


The team worked with the LRCH Principal, Nancy Rousseau, to determine what could be covered for projection. The dark woodwork and clear glass of the doors and arched windows create virtual holes in the projection, so they must be covered with a lighter material that matches the surrounding stone. The doors and windows cannot be covered during daylight hours because visitors to the site take pictures of the building. It was agreed that the arched windows can be covered on the interior, which makes it possible to project onto the glass panels. The wood in the windows will still be visible, so the projection designs were changed to accommodate it. The doors will be covered by an exterior flat drape that can be hung minutes before the event; that drape is currently in the design phase to determine the best way to attach it to the building quickly. The exterior wall sconces will be covered at the same time as the doors, and those covers are also in the design phase.

Logistics (re-written from the NEA report)

We decided to handle the generator and scaffolding for the projectors locally, and that has turned into quite a drawn-out process of qualifying the equipment. The funding source for these and for the sound system also wanted multiple quotes, so we are working through three vendors… Hopefully this part will come to a close in the next day or so.

Something Unexpected

Thanks to this blog, I was contacted by a member of the Morehead Planetarium at UNC Chapel Hill. He was interested in using my LRCH model for their upcoming production that focuses on the American South. Pretty cool.

Geek Stuff

Software and hardware being used to do the project:

Blender – 3D animation. Easily my favorite computer program; I wish I could use it for practically everything I do on a computer. I just like working in it because it is so customizable and responsive.

Affinity Designer – vector-based texture maps and working with vector art, such as logos. I hate Adobe Illustrator with a passion – just never liked it. I originally learned vector graphics on FreeHand, but Adobe bought and killed it years ago. I’m not much of a vector graphics person anyway, so if a program doesn’t behave the way I expect, or is slow, I make it go away. Designer is fast and easy to use, and it is especially friendly for cleaning up vector graphics.

Affinity Photo – raster-based texture maps and photo manipulation. I haven’t opened Photoshop in over six months. Photoshop is probably my favorite Adobe app, but Photo just feels nice. A few things that work well compared to Photoshop: great vector integration – it’s like a healthy marriage between a raster and a vector program; all adjustments, such as levels, are treated like adjustment layers that can affect everything below them in the layer stack or just a single layer; and it’s fast.

After Effects – compositing 3D renders and 2D images. I’m using this more out of habit than anything else, and because I’ll be sharing projects with my fellow artists.

Apple Motion – I used this for a couple of animated textures in the school life section. It’s super fast and I have always liked it. My only gripe is that the timeline can get crowded.

Premiere Pro – I’m using this to compile all of the rendered sequences and audio. I’m sharing the project with my fellow artists; otherwise I would not use it.

DaVinci Resolve (14, latest beta) – used to create a photo sequence for the school life section. I really like 14; it finally has a fast and responsive UI. The sequence had three areas on the screen for photos, so I used three tracks and transformed (position, scale) the photos on each: track 1 for left, track 2 for center, track 3 for right. I started doing this in Premiere, but it was sluggish with the high-res photos; Resolve handled them easily. I assumed FCPX wouldn’t be good for what I was doing, but I later found it would have done a great job and was faster than Resolve. Here’s how I know.

I built the photo sequence in Resolve expecting to composite it over the 3D imagery in After Effects, using the alpha channel for the moments when I needed pure transparency. It turns out Resolve can’t export a full timeline with transparency, only individual clips, and I needed the full timeline! I tried sending the sequence to Premiere via XML and AAF. The AAF failed to import; the XML did an okay job, except that the far left and right images were not in the correct place. Same issue with FCPX: left and right were moved outward. I started moving them back in FCPX just to see what would happen, and I found it to be super fast; though there technically aren’t tracks, they sure looked like tracks in the timeline. Since I was multiplying one of the sequences in AE, I decided to just change its background to white in Resolve – done. The other used a hard light blend mode, so I made its background medium gray – done.

Lesson learned – Resolve rocks so far, but don’t expect an alpha channel on the full sequence.
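For anyone wondering why white and medium gray worked as stand-ins for transparency: in a multiply composite, white in the top layer is neutral, and in a hard light composite, 50% gray is neutral. Here’s a quick numpy check of both identities – my own illustration using the standard blend-mode formulas on 0–1 pixel values, not anything exported from AE:

```python
import numpy as np

def multiply(base, blend):
    # Multiply blend: darkens; white (1.0) in the blend layer is neutral.
    return base * blend

def hard_light(base, blend):
    # Hard light: multiply-like below 0.5, screen-like above;
    # 50% gray (0.5) in the blend layer is neutral.
    return np.where(blend < 0.5,
                    2.0 * base * blend,
                    1.0 - 2.0 * (1.0 - base) * (1.0 - blend))

base = np.array([0.0, 0.25, 0.5, 0.75, 1.0])

# A white background leaves a multiply composite unchanged...
assert np.allclose(multiply(base, np.ones_like(base)), base)
# ...and a 50% gray background leaves a hard-light composite unchanged.
assert np.allclose(hard_light(base, np.full_like(base, 0.5)), base)
```

So as long as the “empty” areas of each sequence are filled with that blend mode’s neutral color, the missing alpha channel doesn’t matter.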


1st gen 5K iMac – Blender modeling and animation, plus all the other apps. Best monitor ever – so crisp. Blender handles custom DPI in its interface, so it looks smooth and is easy to read because I can push the text and buttons up a little more than the 2x DPI other apps use.

Downside – pixel doubling makes lower-res images look soft, and the HD resolution we are using makes the images small when viewing 1:1 in graphics programs. This monitor is made for 4K production, so HD is small…

Custom Linux PC – Blender rendering using two GPUs for acceleration. It is technically a faster computer than the iMac for multi-threaded apps because it has six hardware CPU cores. I should try After Effects on the Windows side of it, but I despise working in Windows – I only have it there to learn VR via the Unreal Editor and to use my mocap system.


LRHS Projection Mapping – Animation Experiments

I animate and render what the projector will play back, then project that animation onto the facade model, which has a texture similar to the real building, to simulate what it will look like on site.

The first animation has three statues moving their arms. After starting the render I went for a walk (for those new to the blog, that’s what the name means: render + walk, because you can’t do much else on the computer while rendering). It occurred to me that when this is projected onto the building, the statues’ arms will land quite a distance from the actual statues because of the facade’s depth. This isn’t much of an issue when looking at the building from front-center, especially near the projector, but off-axis I suspected it would suck.

So I rendered a view off-axis to check.

I didn’t like it, for two reasons. First, my original hypothesis was correct: the arms land pretty far away. Thanks to the trees forcing the audience towards the center of the viewing area, this is only an issue for about a third of the crowd, but I still don’t like it. Second, any illumination on the actual statues makes them stand out as statues, so I feel like we won’t be able to really remove them as I had hoped. The side view does look cool, even without illumination on the sides of the pillars and arches. It would be possible to project onto those too, but that’s beyond this project’s budget.

So I created a new animation. This one is better at making the statues seen only when I want them to be seen. However, there is a moment when I have the statue “niches” rise up behind them, and by then it’s too late: they can already be seen. The lesson is that as parts of the building are highlighted or animated they need a strong silhouette – subtlety is lost as soon as there is any light on them.

I’ve left the exterior lanterns, doors, and windows their natural color, which is dark, on the projection model for now. Our goal is to cover those with a material that reflects light better.

Here’s a fun experiment… A little bit of depth shading on a blueprint.


Geek stuff warning

When I was preparing the model to simulate the projection on the building, I found that some of the proportions of the statues were off by too much to let go. Thanks to some new photos I took of the building, I had more modeling work to do to get it right. I spent some time moving parts of the statues around until they properly aligned with the real statues, and I also tweaked the building, windows, and doors a little. It was a one-step-forward, two-steps-back moment, but it looks a lot better now and I have a lot more confidence in the projection.

The animations above were 750 frames each; rendering them and then rendering the projection simulations came to 4500 frames, plus re-rendering some sections after deciding to make tweaks. I use two computers to render. One is a Retina iMac and the other is a custom-built Linux/Windows PC. The iMac renders on its CPU (4 cores/8 hyperthreaded) and the PC renders on two Nvidia GPUs. In some cases the PC can render four or more frames for every one the iMac renders because the GPU acceleration is so great.

Unfortunately/fortunately, the Blender Cycles developers have been working hard on GPU acceleration – including, BTW, developers at AMD working to free Cycles from its Nvidia-only limitation. I say unfortunately because on one of the animations the Cycles render on the PC was crashing every 40 frames or so. It’s a sad morning when you see that the faster computer hasn’t been rendering for the last 6+ hours…

I don’t have time to troubleshoot the issue. It’s a mix of Blender/Cycles and Nvidia software, and it’s not that bad in the grand scheme of things. To deal with it, I dusted off a Python script I wrote several years ago for a compute cluster we had at UCA, which created a job script for the distributed computing software. I was able to simplify it quite a bit and have it spit out a shell script (like a batch file, for you Windows weirdos) so that Blender renders each frame as its own job rather than one job rendering all of the frames. Essentially, it changes this one line that I manually type in a terminal:

blender -b blendfile.blend -a
(this tells blender to start without a UI to save resources and then render the animation based on the project’s settings)

Into a shell script full of lines like these, which I then run:

blender -b blendfile.blend -f 1
(render frame 1 based on the project’s settings and then close Blender)
blender -b blendfile.blend -f 2 (then render frame 2)
blender -b blendfile.blend -f 3 (then render frame 3)

Works like a charm. I could make the python script do a lot more tricks, but for now this is nice.
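The guts of it are simple enough to sketch here. This is not the original script, just a stripped-down version of the idea, using the file name and frame count from above as the example:

```python
# Minimal sketch, not my original cluster script: write a shell script
# that runs one Blender job per frame, so a crash costs at most the
# frame in flight instead of the rest of the animation.
def write_per_frame_script(blend_file, start, end, out_path="render_frames.sh"):
    lines = ["#!/bin/sh"]
    for frame in range(start, end + 1):
        # -b: run headless (no UI); -f N: render frame N, then quit.
        lines.append(f"blender -b {blend_file} -f {frame}")
    with open(out_path, "w") as f:
        f.write("\n".join(lines) + "\n")

# One of this project's 750-frame animations becomes 750 separate jobs.
write_per_frame_script("blendfile.blend", 1, 750)
```

Since each line launches a fresh Blender process, a GPU crash kills only that frame’s job and the script just moves on to the next line.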

Last, Blender has a simple way to let multiple computers render the same animation without a render-management application: set the output to not overwrite and to make placeholders. A computer looks for frame 1 in the folder where the rendered images are saved (the output folder); if it sees it, it looks for frame 2, and so on. When it finds a frame that hasn’t been rendered, it creates a placeholder image, renders, and replaces the placeholder with the finished image. Each computer claims frames as it goes, which is nice since one computer renders so much faster than the other. After Effects works this way too if you use multiple computers to render.
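The placeholder trick works because creating a file can be made atomic: whichever machine creates the placeholder first wins that frame. Here’s a toy sketch of the protocol in Python – my illustration of the behavior, not Blender’s actual code:

```python
import os
import tempfile

def try_claim_frame(path):
    # Open mode "x" fails if the file already exists, so only one
    # machine can create the zero-byte placeholder and claim the frame.
    try:
        with open(path, "x"):
            pass  # leave a zero-byte placeholder; the render replaces it
        return True
    except FileExistsError:
        return False

# Demo: the first claim succeeds; a second machine's claim is refused.
frame = os.path.join(tempfile.mkdtemp(), "frame_0001.png")
assert try_claim_frame(frame) is True
assert try_claim_frame(frame) is False
```

Because the claim is just a file in the shared output folder, no coordination beyond a common network drive is needed.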

Since I’m not using a management system, there is no check to make sure a frame actually got rendered properly, so back in the day I also wrote a Python script that looks for zero-byte frames to tell me if there were any bad ones. I might fold that into my other script, but I don’t want to dedicate the time to that right now. The macOS Finder does a nice job of listing “zero bytes,” which stands out in a list (or you can sort by size), so I’ve also deleted bad frames manually. After deleting them, I just run the first command with “-a” again; Blender finds the missing frames and renders them.
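I’m not posting that script either, but the check boils down to scanning the output folder for zero-byte files. A minimal reconstruction (the function name and the .png extension are my assumptions):

```python
import os

def find_bad_frames(output_dir, ext=".png"):
    # A zero-byte frame is a leftover placeholder or a render that died
    # before writing the image; it needs to be deleted and re-rendered.
    return sorted(
        name for name in os.listdir(output_dir)
        if name.endswith(ext)
        and os.path.getsize(os.path.join(output_dir, name)) == 0
    )
```

Delete whatever it reports, then the usual “blender -b blendfile.blend -a” pass fills in the missing frames.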