3D Printing – my journey begins

I’ve been interested in 3D printing for quite a while. I was introduced to it back in the early 2000s, when there were only two printers at Purdue. The stereolithography printer was over $100K and made some incredible parts (I only saw the machine and its output – I never did anything with it). The desktop printer was $30K and not nearly as cool, but I got a chance to work with it a little.

Fast forward 11 years or so, and 3D printing is possible at much lower prices. I started paying attention to the products coming out or getting started on Kickstarter. The prices were a bit high, but there were some lower-cost printers on the horizon. I followed Makibox for over a year, waiting for the A6 to see the light of day. It was only $350 shipped and the output looked good. Once it started to ship, I checked out the forums to see what owners were saying. I was not impressed.

I decided to keep researching and came up with a set of criteria for choosing a printer:

  1. Price – Printers are expensive. I have to be able to afford one, but you tend to get what you pay for.
  2. Printing materials – The two most popular materials are ABS and PLA plastics. They both have strengths and weaknesses I won’t get into in this post. PLA seems to be the most popular now. It does not require as many features on the printer as ABS. It’s also possible to print nylon, PET, wood-based plastics, and other thermoplastics.
  3. Open source vs. proprietary software and control – Most desktop printers are derived from RepRap open-source printers, but some are completely turn-key products from a vendor. The RepRap-based printers can use several different software packages available online and can be tweaked by the user, but they require more technical savvy. Turn-key printers tend to be all proprietary, which may restrict them to certain materials and capabilities, but they may be easier to use.
  4. Print size – How big of an object can the printer make?
  5. User community and user comments – Read what users say about the product. How easy is it to use? How does it hold up? etc…
  6. Vendor/Manufacturer – What do users think of them? How long have they been around? Where are they in the world? What kind of support do they offer?
  7. Number of extruders – Most printers have one extruder, which means they can print with one material and one color at a time. Some have two extruders, which means they can print with two different colors or material types (or both, of course).

After researching for months I came up with what I believe is the perfect combination of the above – the Makergear M2.

  1. Price is on the higher side of the average for desktop printers. I purchased the “kit” version, which lowered the price considerably.
  2. Prints ABS and PLA. ABS requires a heated bed, which is not available on many printers (including Makerbot’s newest printers).
  3. Uses open source electronics and software. They have an option for commercial software if you want an easier start-up experience. The open software also means that using other materials is a possibility. Each material has different extrusion needs (temperature, speed, and so on), which can be tweaked in software.
  4. Large print volume – 8″ x 10″ x 8″
  5. I was really impressed by what I was reading on the forums and reviews.
  6. Makergear is in Ohio and most of the hardware is made in that region!
  7. Single extruder, which is fine for now. Makergear has said that a dual extruder is in the works and will be field upgradable.

Once I opened the box, I realized that the “kit” should really be called “partially assembled.” I had watched videos online and read stories by kit buyers, so I was expecting a lot of work. Makergear had already assembled the hard parts, and it only took me about five hours to finish the assembly – taking my time and double-checking every step. The next day I calibrated it (leveled the bed and set the z-stop) and sent a test print. It worked great!

Unpacking the M2

M2 Printing Test Bracelet

Next, I thought I would do something a little harder – Yoda. I did not really know what I was doing, but I jumped in. I decided to go with Repetier Host for the software and use some M2 presets I found online. Unfortunately, those presets were not good for Yoda. It was a disaster – I knew something like this would happen since I am still learning, but ouch! It was printing a nearly solid object, and halfway through, the print slid off the center of the bed, but the printer kept trying to print.

M2 Bracelet and Bad Yoda Print

The next day I did some more research and figured out how to print Yoda hollow and make sure the PLA fan turned on at the right times (to handle the overhangs better – like his chin). I also decided to go with painter’s tape on the bed rather than heating it. The print was fast and amazing! I’ve printed a few other things since Yoda and everything has been great.

Failed Yoda Begins on M2

Good Yoda in Process on M2

Finished M2 Printed Yoda

I’ve made two simple objects of my own design so far and printed them with no issues. In the next posts on 3D printing I’ll mention modeling software and the process involved in printing one’s designs.

Post Pipeline for “The Long Drive Goodnight”

I was reading this and realized that I never did a follow-up on the post pipeline for The Long Drive Goodnight.

The Long Drive Goodnight was produced, directed, and edited by my colleague, Mike Gunter. He used a student cast and crew to shoot and help with the post, and I supervised the visual effects and post pipeline.

Please read this before continuing.

Mike cut the film with Premiere Pro CC. The majority of the shots required greenscreen removal and replacement backgrounds (visual effects shots). Previously, I mentioned that I wanted to trim the clips to make it easier to distribute the shots to my students.

Making New Footage and Checking the Edit

The edit was exported as an AAF and imported into Blackmagic DaVinci Resolve 10. In Resolve, I rendered the entire timeline as new individual clips (ProRes 422 HQ) with unique names for each shot and 36-frame handles, along with an XML (old FCP style). All was well until Resolve tried to do anything with the sound effects. It did not like the MP3s Mike used and stopped the rendering process. It took a little while to realize that was what was happening, so I finally deleted them and the render finished. I needed it to finish so it would also create the XML based on the newly rendered footage.

The XML was imported into the same PPro project as the original edit. I then copied and pasted the new version onto an upper track of a copy of the original sequence, set the opacity of the upper track to 50% by copying and pasting attributes to all the clips, and went through each cut to make sure they all aligned. The only issues were in the shots that had speed changes – the in and out points and the speed were off. Resolve gives different options for dealing with speed changes on import, and I probably could have figured out how they should be set for an AAF coming from PPro, but I was too far gone for that. Instead, I just fixed the in/out points and speed percentages by hand.
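
We did that check by eye, but on a longer edit I would script it. Here is a rough Python sketch of the idea, assuming both versions of the sequence are exported as FCP-style XML (which we did not actually do here – the original edit came over as an AAF) and using made-up file names: pull the timeline start/end of every video clip from each version and flag any cut that drifted. Something like this would have pointed straight at the speed-change shots.

    import xml.etree.ElementTree as ET

    def cut_points(xml_path):
        """Return (clip name, timeline start, timeline end) for every clip
        on the video tracks of an FCP-style sequence XML."""
        cuts = []
        root = ET.parse(xml_path).getroot()
        for video in root.iter("video"):          # video track containers
            for clip in video.iter("clipitem"):
                start = clip.findtext("start")
                end = clip.findtext("end")
                if start is not None and end is not None:
                    cuts.append((clip.findtext("name"), int(start), int(end)))
        return cuts

    # Hypothetical file names.
    original = cut_points("original_edit.xml")
    rerendered = cut_points("resolve_rendered_edit.xml")

    if len(original) != len(rerendered):
        print("clip counts differ:", len(original), "vs", len(rerendered))

    for (name_a, s_a, e_a), (name_b, s_b, e_b) in zip(original, rerendered):
        if (s_a, e_a) != (s_b, e_b):
            print(f"{name_a} / {name_b}: {s_a}-{e_a} vs {s_b}-{e_b}")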

With everything looking good, I then assigned the vfx shots to my students.

Visual Effects

My students did the lion’s share of the effects using After Effects. Most shots needed greenscreen extraction (plus some cleanup with masks and spill suppression), a background plate comped in (and sometimes stabilization), and comped-in window grunge. The hardest and/or leftover shots were done by me – about seven in the end. We did not do any color matching between the foreground and background plates.

Instead, the students exported two clips per shot. The first was the foreground combined with the window grunge, with an alpha channel, exported as ProRes 4444. The second was the background, exported as a ProRes 422 HQ QuickTime. We all had to make sure that the final foreground renders had exactly the same names as the greenscreen footage (the shot clips rendered from Resolve and checked in PPro earlier) so they would go back into the edit properly.
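
A short script would have caught naming mistakes before the conform instead of us discovering them later. Here is a minimal sketch in Python; the folder names are made up, and it assumes both the Resolve-rendered shot clips and the students’ renders are .mov files named after the shot.

    import sys
    from pathlib import Path

    source_dir = Path("greenscreen_clips")    # shot clips rendered from Resolve
    render_dir = Path("student_foregrounds")  # ProRes 4444 renders from AE

    source_names = {p.stem for p in source_dir.glob("*.mov")}
    render_names = {p.stem for p in render_dir.glob("*.mov")}

    missing = source_names - render_names     # shots nobody has rendered yet
    extras = render_names - source_names      # renders whose names will not conform

    for name in sorted(missing):
        print("no foreground render for:", name)
    for name in sorted(extras):
        print("render does not match any source clip:", name)

    sys.exit(1 if (missing or extras) else 0)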

Final Compositing and Color Matching

Foreground + Background = Composite in Resolve

I exported the checked edit from Premiere Pro via XML (which worked fine since it had no merged clips) and imported it into a new Resolve project. Next, I brought in the background clips. I was hoping that the editing tools in Resolve would make it easy enough to add the background clips on a new track under the foreground clips. It was doable, but I could not find any keyboard shortcuts for setting in and out points in Resolve’s viewer panel in the time I had set aside. I needed to set the in and out points 36 frames from the head and tail of each clip.

So I went back to PPro, imported the background footage, and went shot by shot adding the background clips. The process: double-click the clip in the project browser so it goes to the viewer > press “Home” > “+” > “36” > “I” > “End” > “-” > “36” > “O” > drag to the sequence. Rinse and repeat for each clip. I also checked that each foreground shot aligned with the original edit. In three cases, my students had changed the length of a clip because their AE work area was longer than the clip; I just adjusted the in/out points rather than re-rendering.

Had there been many more shots, I would have automated that somehow, either by text-editing an XML or by writing a small tool – something like the sketch below.
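
For the record, here is roughly what that tool could look like, as a Python sketch. It assumes an FCP7-style XML (xmeml) export and exactly 36-frame handles on every clip, and it ignores audio, speed changes, and the timeline positions, so it is a starting point rather than a drop-in conform tool. The file names are made up.

    import xml.etree.ElementTree as ET

    HANDLE = 36  # frames of handle on each side of every rendered clip

    def trim_handles(xml_in, xml_out):
        """Set each clipitem's source in/out points to 36 frames from the
        head and tail of its media."""
        tree = ET.parse(xml_in)
        for clip in tree.iter("clipitem"):
            in_el = clip.find("in")
            out_el = clip.find("out")
            # Media length: prefer the file's duration, fall back to the clip's.
            dur_el = clip.find("./file/duration")
            if dur_el is None:
                dur_el = clip.find("duration")
            if in_el is None or out_el is None or dur_el is None:
                continue  # titles, generators, and other odd items
            media_dur = int(dur_el.text)
            if media_dur <= 2 * HANDLE:
                print("skipping (shorter than the handles):", clip.findtext("name"))
                continue
            in_el.text = str(HANDLE)
            out_el.text = str(media_dur - HANDLE)
        tree.write(xml_out, encoding="UTF-8", xml_declaration=True)

    if __name__ == "__main__":
        # Hypothetical file names.
        trim_handles("background_track.xml", "background_track_trimmed.xml")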

By the way, I checked to make sure the composite was going to work while still in PPro. It turned out that the alpha mode needed to be changed from premultiplied to straight (or the other way around – I can’t remember), but Premiere Pro does not have an option to change the alpha interpretation. The docs said they leave it up to the writer of the importer. I thought that was BS for sure.
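
For anyone wondering why the alpha interpretation matters, here is the arithmetic with a single half-transparent pixel (a toy Python example, not anything from the actual project). With a straight alpha, the color channels are stored untouched and the compositor multiplies by alpha itself; with a premultiplied alpha, the color channels have already been multiplied by alpha. Interpret one as the other and the edges come out wrong.

    # One half-transparent pixel, values in the 0-1 range.
    fg_color = 0.8   # foreground color as authored
    alpha    = 0.5
    bg_color = 0.2   # background color

    # Straight (unpremultiplied) over: the compositor applies alpha itself.
    straight = fg_color * alpha + bg_color * (1 - alpha)        # 0.5

    # Premultiplied over: the file already stores color * alpha.
    fg_premult = fg_color * alpha
    premult = fg_premult + bg_color * (1 - alpha)               # 0.5 -- same image

    # Misread premultiplied pixels as straight and alpha gets applied twice:
    wrong = fg_premult * alpha + bg_color * (1 - alpha)         # 0.3 -- dark fringing

    print(straight, premult, wrong)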

Anyway, I exported to XML again and got the sequence into Resolve. Resolve has the tools to change the alpha mode, and all was well. Michael Xiques, one of our students, then took the project and color graded the foreground and background of each shot using the regular tools Resolve has to offer.

Final Post

Mike did his titles in PPro and exported them as footage to bring into Resolve. Tyler Hutchins, another one of our students, did the final audio mix. By this point everything seemed to be under control, so I wasn’t involved much. Mike told me later that they got the mixed audio into Resolve and everything seemed fine in the interface, but when they rendered, the audio and video went out of sync. I never got a chance to see what could have caused that. They exported the picture from Resolve, married it with the audio in PPro, and everything was fine when exported from there.

Conclusions

Learning Curve

The edit was Mike’s first with Premiere Pro, and it was the first time I had managed a post pipeline involving multiple people. I ended up putting a lot of hours into figuring out how best to move the project between PPro, AE, and Resolve. Mike and Michael also had to learn Resolve essentially from scratch. The biggest issue they ran into was wrapping their heads around the idea that there is no project file in the way there is for PPro, AE, FCP7, etc. – Resolve keeps projects in its own database. They couldn’t move their work to different computers at will like they were used to. We had to learn to export a project and re-import it on another computer.

We also learned that Resolve needs good hardware to run well. Our newest lab of 27″ i7 iMacs handled everything just fine, but as we moved the project to older hardware, Resolve became unusable.

RTFM Moments

When I prepared the project for Michael to pick up and work with from his own hard drive, I had to learn Resolve’s version of reconnecting media. It does not handle this like a typical NLE. Instead of just telling it where to look for footage in any arbitrary folder, it expects the folder/directory structure to never change. Once I figured out what it wanted, I was able to move the project to different drives as needed. I hope that v. 11 offers more than “change source folder” or whatever they call it now – they’ve changed the tool’s name a couple of times without actually changing the functionality.

Conforming a new version of an edit to an existing graded edit is not as easy as it was in Apple’s Color, IMHO. Once I really get my head around the options for moving a color grade from one version of an edit to a new version in Resolve, all will be well. Luckily, the only time we ran into this – work started on one edited sequence needing to move to another – only a few shots had actually been graded. We decided to just save stills of those shots and re-apply the grade in the newer edited sequence.

Going Forward

I am really excited about the upcoming version 11 of Resolve. After watching some of the videos on Blackmagic’s website, I feel like most of the issues I had are taken care of in the new version. It’s supposed to be available sometime this month (June ’14). For projects that several artists have their hands on, it makes a lot of sense to finish in Resolve. Depending on the NLE features that are coming, it may make a lot of sense to start in Resolve as well… Now if they would only let you turn off the clip thumbnails in the Edit tool.