Metallica – part I.5, after S&M

Prior to late 1998 I had done some freelance design and architectural visualization and worked at a theater consulting firm, but the PMO Christmas show catapulted my career as a media artist thanks to its connection with MooTV. After S&M, I spent the next 2 years working on high-profile gigs and developing a lasting friendship and partnership.

In that short time I worked on gigs for companies and events, such as:

  • Lucent Technologies corporate event in the Bahamas
  • The Robin Hood Foundation annual fundraiser in New York City
  • Avaya Communications corporate event in Hawaii
  • Country Music Association Fan Fair in Nashville (now known as the CMA Music Festival) where they have both kinds of music
  • Tim McGraw 1999 Tour (did not travel with the tour)
  • 1999 PMO Christmas Show, with Robert Kovach designing the scenery and me doing the projections
  • Tim McGraw and Faith Hill 2000 Soul to Soul Tour (did not travel with the tour)
  • Metallica 2000 Summer Sanitarium Tour (I went on the first leg of the tour)
  • The 2001 Rock ‘n Roll Hall of Fame Induction Ceremony in New York City

I travelled a lot, got paid well, and learned a lot about being a professional in the entertainment industry. In the summer of 2000 I worked as an in-house freelancer at MooTV, where I created graphics and acted as a project manager on a couple of gigs. We were also working with a production/management company that had a lot of potential but unfortunately imploded when too many industry veterans and outsiders tried to work together.

By the winter of 2000-2001 things changed. Jim Lockhart and I had developed a great friendship, and we had different ideas about what we could be doing with our talents compared to what was going on at MooTV. Jim worked for MooTV as an employee and I was an independent. He had increasingly frequent disagreements with Scott on how to run the business, so he put himself in a place financially where he could take some risks. By early spring of 2001, Jim left and started his own company called Emagination-Media. I chose to work with Jim, of course, and though I had no plan of breaking my ties with MooTV (naively thinking I was Switzerland for some reason), that’s the way it ultimately went down. The three of us had become so close that we were incapable of talking to each other like professional adults, and we let things fall apart much like a broken relationship in a family. The more I have thought about it over the years, the more I think I could have done something to make things go smoother for all of us.

Jim and I have worked together ever since on various gigs, and we consult with each other on a regular basis about everything from video and graphics workflow to beer brewing.

Ten years went by before I got a call from Scott to do some work again. We didn’t even come close to talking about the past. Instead, our conversations were as if we had never taken a break from working with each other. Nowadays I do not travel to gigs like I did, but I work with him when he needs my talents in 3D animation and concert design.

The same year Scott got back in touch, so did Robert Kovach – pretty cool. We did spend some time catching up. We didn’t have a weird break like with MooTV; we had just gone our separate ways after grad school and gotten busy doing other things. Over the last few years I have worked with him on many different scenery designs for live events, including Canon’s booth for the 2013 Consumer Electronics Show (CES).

My next, and probably last, Metallica post will be on the Summer Sanitarium Tour. I have to gather some images and video first. Be ready for bad catering, a flood, private jets, equipment breakdowns, and a jet-ski accident.

Metallica – part I, S&M

One of my Introduction to Film students discovered that I worked on the Metallica S&M (Symphony and Metallica) concert. He wanted me to talk about it. I told him I would do a blog post on it. Here it is.

His first question – How did you get the gig?

How I got the gig…

Take a moment of silence for the two birds that died for this one blog post.

The 1998 Purdue Musical Organization (PMO) Christmas Show is where the story starts, and I don’t want to blog about the Christmas Show again, even though it was huge, and the show the next year was pretty huge too. In the spring of 1998 I was a graduate student in scenography (a theatre term for production design) at Purdue University. I was asked to co-design the PMO Christmas Show with the manager of the Elliott Hall of Music on Purdue’s campus. The Hall of Music is a 6,000-seat theater designed with the same consultant who helped design Radio City Music Hall in New York, so it is very similar in size (a 100′-wide stage, for instance). The Christmas Show had a budget of approximately $250K, and its proceeds funded the Glee Club’s travels and programs for the following year (yup, Glee Club – Purdue had no music major at that time, but had one of the biggest music programs in the country – weird). The show sold out 6 performances every year and drew people from all over the region. It was also featured on PBS on a regular basis during the Christmas season. Just a reminder – 6 shows at 6,000 people = 36,000 people. Ticket prices varied based on distance from the stage, but nevertheless, this concert made a lot of money.

The show required a lot of physical scenery – sets built and painted during the summer. When I was brought onboard I recommended that we use projections as part of the scenery. The co-designer/manager, Steve Hall, drew a crude sketch of how he wanted the people and existing painted drops to be used, and he wanted lots of pictures of Santa Claus. I took that brief and turned it into a set and decorations. I then worked with the PMO grand dragon, Brian Breed, to develop imagery ideas for each of the songs. Throughout the summer, I worked with fellow grad student and now professional scene designer, Robert Kovach, to create all of the physical scenery. We mainly painted, while Ron Clark and the carpenters built the scenery. Throughout the fall semester I created imagery that would be projected during the show.

[Images: design rendering, Christmas show set, Santa design]

At some point during the summer I took a road trip with the video director at the Hall, Bill Callison, to visit some guys in Nashville, TN, who would handle the playback and projection equipment and the video logistics for the show. The guys were Scott Scovill, owner of MooTV, and Jim Lockhart, compositor/artist and essentially the only employee at MooTV at the time. We walked through the problems: a 66′-wide by 24′-tall screen space, it needed to be bright enough for a live staged event, it needed playback, and the image size required multiple projectors and some edge overlap/blending.

[Image: March of the Toy Soldiers animation]

[Image: Jim at the controls]

Jim worked out the overlap and blending. I created the imagery at a single raster size and then we ran the full image through three After Effects composites, which would break the image into three different sections with appropriate gradients that would blend together properly when overlapped using multiple projectors and video servers. Nowadays the playback software does this automagically.
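
For the curious, here is a rough sketch of the idea in Python/NumPy – not the original After Effects comps, and the section and overlap widths are made up for illustration. Each section gets a linear ramp across its overlap zone so that the light from two overlapping projectors sums back to full brightness:

```python
import numpy as np

# Illustrative numbers only -- not the actual 1998 raster or overlap widths.
SECTION_W = 1024   # width of each projector section in pixels
OVERLAP   = 128    # pixels shared by neighbouring sections
FULL_W    = 3 * SECTION_W - 2 * OVERLAP  # width of the single master raster

def split_with_blends(frame):
    """Split a full-width frame (H x FULL_W x 3, floats in 0-1) into three
    overlapping sections whose overlap zones get opposing linear ramps, so
    the light from two projectors sums back to the original brightness."""
    step = SECTION_W - OVERLAP
    ramp = np.linspace(0.0, 1.0, OVERLAP)              # 0 -> 1 fade-in
    sections = []
    for i in range(3):
        x0 = i * step
        sec = frame[:, x0:x0 + SECTION_W].copy()
        if i > 0:                                       # fade in on the left edge
            sec[:, :OVERLAP] *= ramp[None, :, None]
        if i < 2:                                       # fade out on the right edge
            sec[:, -OVERLAP:] *= ramp[::-1][None, :, None]
        sections.append(sec)
    return sections
```

Real blends usually account for projector gamma (the ramps are applied in linear light), but the principle is the same – and, as mentioned, modern playback software builds these blends for you.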

We had three pairs of projectors. Any one of the projectors was more expensive than my house… One of each pair was a digital projector with a short-throw lens. It was crazy expensive just by itself. The analog projector below it was dialed in to match the digital projector, since the analog unit had more image-warping capability. By the time all six were aligned and properly blending we had a total of 33K lumens on the screen. FYI, a typical classroom projector runs at 3K lumens.

We played the imagery back on three DoReMi video servers and triggered them using a single Dataton timeline. All this stuff can be done by elementary students now, but in 1998 we were among a select few in the world who could pull off such a project. Pretty cool.

Now to the point – I impressed the guys at MooTV. They saw that I could create imagery in 2D and 3D very fast and at a high level of quality. Scott called me the “Art Factory.”

Images: design sketch – me putting details to Steve’s overall ideas. Maquette of one of the 12′ Santa sculptures I made – the maquette is made with cheap papier-mâché material from Hobby Lobby. The set as seen on stage. The set with a large part of the Glee Club and the animation of the March of the Toy Soldiers I created in 3D. Jim Lockhart in our lair.

Fast forward 4 months.

Metallica and the San Francisco Symphony

I had read about this project in Entertainment Weekly. I was excited because I had been a Metallica fan since I was 13 years old and I loved classical music. Out of the blue, I got a call from Scott Scovill. He said he had a gig he couldn’t talk about, but that it was with a symphony and a famous band. I said Metallica. He said yes. I said I was in – what do you need? He told me the brief, which was to bring Mark Rothko-like paintings to life. They had tried a few things, but they were not working out and they thought I might have a different take on it. I did. My art abilities and training in art history became very useful, and I threw together a couple of demos. It was a little over a week before the show. I suggested that I be there on-site rather than working from Indiana. Scott agreed. A few days and animations later I was on a plane to California.

A side note about computers. Laptops were popular at this time, but had very little power for animation. I had recently built a nice computer based around the best Intel server/workstation-class processor available and Windows NT 4. I needed to take it to the gig. My wife, Shauna, made me a carrier for my desktop CPU case so I could travel with it as if it were carry-on luggage. This was pre-9/11, so I made it, but please imagine me walking through an airport with a large desktop computer in a padded bag. Awesome, but it could never happen again. There is another story I could tell regarding getting a monitor at the gig, but I will save it for now. It became a trend (aka running joke) with MooTV gigs.

The gig was on Wednesday, April 21 and Thursday, April 22, 1999. We arrived on-site on Saturday. Sunday through Tuesday we set up the video gear, which included an 18′-tall x 24′-wide LED video wall and several portable racks of video equipment. In 1999, that LED screen was worth $1 million. UCA has something similar in its football stadium now – didn’t cost that much… John Broderick (J.B.) was the lighting and scene designer. He saw the screens being assembled and said that they should not be completely assembled. I immediately became a fan. The wall was assembled by taking screen sections, attaching them vertically into narrow columns hung by chain motors, and then connecting them horizontally. He said don’t connect them horizontally. Let’s adjust the height of each column so they do not align. Then let’s rotate each column so they are not flat to each other. The result was not a rectangular screen, but a series of vertical columns with imagery that spans across them in a disjointed manner (too academic a description?). I got what he was after and I was invigorated – and it was Metallica, of course, so I was on top of the world.

JB looked at the imagery we had created by Sunday. He identified animations that should correspond with certain songs. He wanted an old-school liquid light show for the opening (Ecstasy of Gold and The Call of Ktulu), so we sent our cameraman out to shoot one with a hippie in San Francisco. He also wanted a few other things that we needed to create there on-site. Jim and I worked as much as we could to create the rest of the imagery. The lights, sound, and projection were powered by Show Power generators outside the venue. When they shut down, we shut down. No all-nighters for us!

As far as imagery was concerned, this gig was the highlight of my professional life. Before coming to California, I had created a 3D animation of what I thought a Rothko painting would look like if it were in 3D space. I turned the blocks of color typical of his popular paintings into 3D blocks that looked 2D when viewed straight on, but broke into objects on different planes of depth when seen from the side. I created an animation that rotated 360 degrees around these blocks so it could loop easily. I then ran an effect over it so it looked similar to a painting. JB loved it and assigned it to “Outlaw Torn.” It would be the only thing on the screen during the song. I was ecstatic. Why? “Outlaw Torn” is my all-time favorite song – to me, it is the greatest song that’s ever been recorded (BTW, #2 is Creedence’s “Have You Ever Seen the Rain” and #3 is S&G’s “Sound of Silence”). I could post about it all by itself. I simply love the song, and the fact that my favorite work for the gig was assigned to my favorite song put me in a place I have not been since, and never expect to be again, professionally. The S&M version of “Outlaw Torn” is the ultimate music experience, and I highly recommend listening to it as loud as you can – preferably with a 5.1 system and the official DVD set to 5.1.

We had some other things going on the screen. We had a live camera backstage shooting the wonderful texture on the back wall of the theater; we had access to the 5 cameras shooting the concert under a different vendor (the main cameras for the DVD); and we had one of our own cameras (run by the same guy who grabbed the liquid light show – aka Schmaba – an awesome and talented guy, who happened to give me some great advice that I can share in a later post). We used an Abekas DVEous to warp and twist live video. We used a video switcher to key warped video into the dark areas of animated Rothko-esque imagery. We threw the kitchen sink at the screen to be as creative as possible in a live concert environment in a short amount of time. The DVD video director, Wayne Isham, was not always thrilled with what we and JB came up with, but we did not seem to care. It was funny that we were working with the hottest video director of the era, but he seemed to be two steps behind us. Unfortunately, he got the final say on the DVD edit, so our screen is not featured as much as it could have been if we had been more collaborative with him.
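
The keying was done in hardware on the switcher, but the idea is easy to sketch in software. This is a hedged illustration with made-up threshold numbers, working on float RGB frames as NumPy arrays: build a matte from how dark the base imagery is and lay the warped camera feed into those areas.

```python
import numpy as np

def key_into_darks(base, insert, threshold=0.25, softness=0.1):
    """Composite `insert` into the dark areas of `base`.
    Both are H x W x 3 float arrays in the 0-1 range."""
    # Rec. 601 luma of the base imagery
    luma = 0.299 * base[..., 0] + 0.587 * base[..., 1] + 0.114 * base[..., 2]
    # Matte is 1 where the base is dark, with a soft falloff around the threshold
    matte = np.clip((threshold + softness - luma) / softness, 0.0, 1.0)[..., None]
    return base * (1.0 - matte) + insert * matte
```

The soft falloff around the threshold keeps the key from sizzling on noisy, near-black pixels.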

All of this imagery had to be coordinated for the concert. Jim and I were big Metallica fans, but Scott, the director, who ran the switcher, which controlled the screen, was not as familiar with the music – he was our boss, but he was at a loss. Jim was running the playback of the animations using a Media 100 video editor (crazy compared to the tools we have nowadays for realtime playback), so he had a job. Our video engineer, Barry Otto, was running the Abekas. I was done, or so I thought. By Monday evening it seemed I needed to be the assistant video director (the Stage Manager for you Theater types). I ran the show. I called cues for Jim, Barry, and Scott, so all aspects of the screen could be properly coordinated with the live show. I had a lot of experience doing similar work as a light or sound board operator or stage manager in theatrical productions, so I was equipped to handle it; however, I was truly scared by the scale and importance of the project. “Outlaw Torn” was the only time during the show I got to rest and enjoy it – I was fine with that.

There were some other cool moments. Wednesday night after the show, Metallica and the Symphony recorded some singles. They turned the P.A. around and played and recorded “Master of Puppets,” which we stayed around to hear. The bosses wanted to go back to the hotel, so we missed the other recordings they did. There were also some weird moments, such as yelling at Wayne Isham, being yelled at by the bass tech, and hoping we were really pulling this off.

Some images from the show. These are from “Outlaw Torn.” Notice the unity of the stage lighting and screen imagery;)

[Images: three frame grabs from “Outlaw Torn”]

I have a Metallica part II, which covers their Summer Sanitarium Tour in 2000. I have some thoughts about catering at the S&M gig, but I think I will hold them for part II.

First VFX project coming from a Premiere Pro edit

I’ve done several VFX projects with my colleagues at UCA. Until now they were edited with FCP 7 and we had lots of cool options for getting clips to VFX artists. Mike Gunter just picture locked his latest film and he used Premiere Pro CC (latest update) to edit. The film is just under 6 minutes long and about 80% of the shots require greenscreen work. The shots are of characters in a car interior and my students will replace the greenscreen outside the car with some background footage that was shot specifically for the film.

It is my job to separate the shots and assign each of them to my students. I also need to make sure the shots go back into the edit with no issues, and I do QC on the shots, but that’s another post.

Givens:

  • Footage is mostly QT ProRes recorded with a Blackmagic HyperDeck Shuttle external recorder. The film was shot with a Magic Lantern-hacked Canon 5D Mark III that gave a clean 8-bit signal out of the HDMI port to the Shuttle.
  • Picture edit in Premiere Pro CC (PPro)
  • VFX work in After Effects (AE)
  • Color grade will be done with DaVinci Resolve 10
  • No decision made on which tool the online will happen in

Problem: Getting each shot to my students and returning their rendered work to the edit. Mike would like to have handles on the shots just in case he wants to tweak a frame here or there or add a transition.

Possible Workflows:

  • Premiere Pro to After Effects via Dynamic Link: Work is done in AE on original footage and saved, but not rendered. All work is rendered as online in PPro itself.
    • Pros
      • Creates an AE project with most settings done
      • References original footage – no new footage needed for VFX work
      • Maintains in and out points (I/O points) in that referenced footage
      • No need to store new copies (VFX finished) files until the end
    • Cons
      • Dynamic Link – it has burned me before by not properly rendering the AE project. Not with CC at this point, though. I heard an AE developer mention in an fxguide interview that making DL rock solid was not a high priority, so I stay skeptical.
      • I have to manually make this happen for over 40 shots – yuk!
      • Related to the DL issues – troubleshooting rendering problems will be a nightmare, including making sure no AE assets get lost (hasn’t happened yet, but it is related to the QC job I mentioned above)
  • Import Premiere Pro project into After Effects: Work is done in AE on original footage and then rendered as new footage that is re-linked in PPro
    • Pros
      • More stable than Dynamic Link
      • No new footage is needed for VFX work
      • Maintains I/O points
      • Potentially online in AE (some people do it, but yuk!)
    • Cons
      • No native method to export layers as new project files with their own compositions, which means separating the jobs for the students would be a nightmare. I looked into scripting this, which seems very doable, but there are other cons mentioned next…
      • PPro merged clips are treated as compositions and are NOT(!) layers in the main Sequence. This means that merged clips aren’t in the edit anymore – deal breaker if you use merged clips, which Mike did (not his fault, it’s what you do when you record off-camera audio).
      • Must render the complete clip (not just what is inside the I/O points) so the clip can be properly re-linked back to the PPro Sequence/Project
  • Premiere Pro to DaVinci Resolve: Work is done in AE using new trimmed clips and then rendered as footage that is re-linked in Resolve.
    • Pros
      • Conforms sequence – removes everything not used by the sequence. More “industry-standard” practice
      • Export each edited clip as a new piece of footage with minimal handles (48 frames, for instance – see the sketch after this list). Side note: Mike and the DP confessed to forgetting to hit stop on the external recorder a few times during production. In extreme cases, about 4 seconds are used from a 3-minute clip. Creating a new 3-minute clip just so it can properly link back to the original edit sucks for the VFX artists and me. Though making new clips takes up space, they will be smaller clips and the originals can be trashed from the working “hard drive.”
      • Can send the edit with new footage back to PPro via XML, OR, online in Resolve!
    • Cons
      • New footage – gotta make space until everything is run through Resolve, then remove old footage if space is needed on working drive (still have several backups)
      • XML export in PPro is worthless with merged clips (I’m seeing a trend). No matter what I try, it only exports one of the merged clips as a sequence rather than the actual sequence in the project (there is only one since I reduced the project to one sequence using the Project Manager).
      • AAF export to Resolve worked as long as I ignored all of the errors during export.
      • Not a con, but something I read and have not tried – it seems an EDL might work best for getting merged clips to Resolve. I should try it and see.
      • Audio edits are not supported in Resolve. Consider exporting the audio separately, or don’t worry about it since your sound designer will tweak it anyway…
  • Premiere Pro to Final Cut Pro 7: XML to FCP 7 and then use my old batch export workflow and re-link. Not future-proof – won’t even try it, but it is more common than it should be.
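
The handle math behind the Resolve route above is simple enough to sketch. This is a hypothetical Python helper, not a feature of any of the tools mentioned, and the frame numbers are made up:

```python
# Given a shot's in/out points in its source clip (in frames), work out the
# range to render as a new piece of footage with 48-frame handles, clamped
# to the clip bounds. All numbers here are illustrative.

HANDLES = 48

def trimmed_range(src_in, src_out, clip_len, handles=HANDLES):
    """Return (new_in, new_out) in the source clip, padded by `handles`
    frames on each side but never past the first or last frame."""
    new_in = max(0, src_in - handles)
    new_out = min(clip_len - 1, src_out + handles)
    return new_in, new_out

# The "forgot to hit stop" case: ~4 seconds used out of a ~3-minute clip
# at 23.976 fps. Only 192 frames get re-rendered instead of ~4300.
print(trimmed_range(src_in=2000, src_out=2095, clip_len=4316))  # (1952, 2143)
```

A conform tool will typically do this per clip for you when rendering individual source clips with handles; the sketch just shows why the new clips end up so much smaller than the originals.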

Conclusions (for now).

Dynamic Link is for the occasional VFX shot and worth keeping in mind, but it is too cumbersome when most of the edit has to go through AE. Working outside the Adobe ecosystem is hard. People complain about limited capability in FCPX, but a toolset is just as limited when the tools are there but not robust enough to use (merged clips become comps in AE and cause problems with AAF).

For this project I will run all of the footage through Resolve to prep for the students. They will have to create their own AE projects, but that is easy and it is good practice. They will also render their own projects after they get feedback a couple of times.

The next miracle is doing this same thing with R3D footage for the graduate VFX class. Until I see the final list of shots needing VFX, I think we will Dynamic Link them if that’s possible with R3D footage. If the shot count is high enough then we may conform the edit in REDCINE-X or Resolve and follow a workflow similar to Mike’s film.

I contemplated conforming in Smoke instead of Resolve, but for now I will stick with tools that are available in our computer labs.

Last, here’s a shot of the PPro sequence in AE. There should be a LOT more layers, but the merged clips were turned into comps that were not layered into the main comp based on the original sequence. Also see the long clips with only a few frames used in the actual edit – I prefer to trim those down.

[Image: the PPro sequence imported into AE]