LRCHS Projection Mapping – Animation Experiments

I animate and render what the projector will play back, then project that animation onto the facade model, which has a texture similar to the real building, to simulate what it will look like on site.

The first animation has three statues moving their arms. After starting the rendering process I went for a walk (for those new to the blog, that’s where the name comes from: render + walk, because you can’t do much else on the computer while rendering). It occurred to me that when this is projected onto the building, the statues’ arms will end up quite a distance from the actual statues due to the facade’s depth. This isn’t much of an issue when looking at the building from front-center, especially near the projector, but off-axis I felt like it might suck.

So I rendered a view off-axis to check.

I didn’t like it for two reasons. One, my original hypothesis was correct and the arms end up pretty far away. This is only an issue for about a third of the crowd, thanks to the trees that force the audience towards the center of the viewing area, but I still don’t like it. Two, any illumination on the actual statues makes them stand out as statues, so I feel like we won’t be able to really remove them like I hoped. The side view does look cool even without illumination on the sides of the pillars and arches. It’s possible to project onto them too, but that’s beyond this project’s budget.

So I created a new animation. This one is better at making the statues visible only when I want them to be seen. However, there is a moment when I have the statue “niches” rise up behind them, and by then it’s too late – they can already be seen. The lesson is that as parts of the building are highlighted or animated they need a strong silhouette – subtlety will be lost as soon as there is any light on them.

I’ve left the exterior lanterns, doors, and windows their natural color, which is dark, on the projection model for now. It is our goal to cover those with a material that reflects light better.

Here’s a fun experiment… A little bit of depth shading on a blueprint.

blueprint

Geek stuff warning

When I was preparing the model to simulate the projection on the building I found that some of the proportions of the statues were off by too much to let go. Thanks to some new photos I took of the building I had more modeling work to do to get it right. I had to spend some time moving parts of the statues around until they properly aligned with the real statues. I also tweaked the building, windows, and doors a little. Was a one step forward, two steps back moment, but it looks a lot better now and I have a lot more confidence in the projection.

The animations above were 750 frames each. Rendering them and then rendering the projection simulation came to 4500 frames, plus some re-rendering of sections after deciding to make tweaks. I use two computers to render. One is a Retina iMac and the other is a custom-built Linux/Windows PC. The iMac renders using its CPU (4 cores/8 hyperthreaded cores) and the PC renders using two Nvidia GPUs. In some cases the PC can render four or more frames for every one the iMac renders because the GPU acceleration is so great.

Unfortunately/fortunately, the Blender Cycles developers have been working hard on GPU acceleration – including, BTW, developers at AMD working to make Cycles no longer limited to Nvidia GPUs. I say unfortunately because on one of the animations I found the PC’s Cycles render was crashing every 40 frames or so. It’s a sad morning when you see that the faster computer hasn’t been rendering for the last 6+ hours…

I don’t have time to troubleshoot the issue. It’s a mix of Blender/Cycles and Nvidia software, and it’s not that bad in the grand scheme of things. To deal with it I decided to dust off a Python script I wrote several years ago for a compute cluster we had at UCA, where it created job scripts for the distributed computing software. I was able to simplify it quite a bit and have it spit out a shell script (like a batch file, for you Windows weirdos) that renders each frame as a new Blender job rather than one job rendering all of the frames. Essentially it changes this one line that I manually type in a terminal:

blender -b blendfile.blend -a
(this tells Blender to start without a UI, to save resources, and then render the animation based on the project’s settings)

To this, listed in a shell script that I start by running render.sh:

blender -b blendfile.blend -f 1
(render frame 1 based on the project’s settings and then close Blender)
blender -b blendfile.blend -f 2
(then render frame 2)
blender -b blendfile.blend -f 3
(then render frame 3)
…and so on, one line per frame.

Works like a charm. I could make the python script do a lot more tricks, but for now this is nice.
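The simplified version boils down to just a few lines. Here’s a minimal sketch of the idea (my real script does more; the blend file name and frame range here are placeholders):

```python
# Write a shell script with one Blender job per frame, so a crash only
# loses the frame it was on. "blendfile.blend" and the frame range are
# placeholders for your own project.
start, end = 1, 750

with open("render.sh", "w") as f:
    f.write("#!/bin/sh\n")
    for frame in range(start, end + 1):
        f.write(f"blender -b blendfile.blend -f {frame}\n")

print("wrote render.sh with", end - start + 1, "render jobs")
```

Then chmod +x render.sh and run it; each line starts a fresh Blender process, renders one frame, and exits.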

Last, Blender has a simple method for allowing multiple computers to render the same animation without a render management application: set the output to not overwrite and to make placeholders. A computer will look for frame 1 in the folder where the rendered images are saved (the output folder); if it sees it, it looks for frame 2, and so on. When it finds a frame that hasn’t been rendered, it creates a placeholder image, renders, and replaces the placeholder with the finished image. Each computer claims frames as it goes, which is nice since one computer renders so much faster than the other. After Effects works this way too if you use multiple computers to render.
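The claiming logic is simple enough to sketch in a few lines of Python. This is just an illustration of what Blender does with overwrite off and placeholders on – not the actual Blender code, and the file naming is made up:

```python
import os


def claim_next_frame(output_dir, total_frames):
    """Return the first unrendered, unclaimed frame number, or None when done."""
    for frame in range(1, total_frames + 1):
        path = os.path.join(output_dir, f"{frame:04d}.png")
        if os.path.exists(path):
            continue  # finished frame, or another machine's placeholder
        open(path, "wb").close()  # zero-byte placeholder claims the frame
        return frame  # render it, then overwrite the placeholder
    return None  # every frame is rendered or claimed
```

Each machine just loops – claim a frame, render it, repeat – so the faster computer naturally ends up claiming more frames.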

Since I’m not using a management system there is no check to make sure a frame actually gets rendered properly, so I also wrote a Python script back in the day that looks for frames with zero bytes and tells me if there are bad frames. I might fold that into my other script, but I don’t want to dedicate the time to it right now. The macOS Finder does a nice job of listing “zero bytes,” which stands out in a list (or you can sort by size), so I’ve manually deleted bad frames too. To render the bad ones after deleting them I just re-run the first command with “-a” – with overwriting off it finds the missing frames and renders only those.
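My checker amounts to something like this (a sketch of the idea; point it at whatever your output folder is):

```python
import os


def find_bad_frames(output_dir):
    """Return the names of rendered frames that are zero bytes (failed renders
    or leftover placeholders)."""
    return sorted(
        name
        for name in os.listdir(output_dir)
        if os.path.getsize(os.path.join(output_dir, name)) == 0
    )
```

Delete whatever it reports, then re-run the “-a” render and only those frames get picked up again.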

LRCHS – Projection Mapping – 1st Post

The 90th anniversary of the opening of the Little Rock Central High School building and the 60th anniversary of the Desegregation Crisis are coming September 18–25, 2017. It will be a week of activities that commemorates the anniversaries and culminates in an event featuring a projection-mapped animation on the facade of the high school building.

This first blog post is about a major milestone for the animation: a completed virtual 3D model of the facade, including its four statues. Now that the model is complete we can finally get to work; the majority of the animation we create will be based on the architectural structure of the facade. I can’t believe February is almost over! This phase took me over a week longer than I expected thanks to distractions: an illness that caused horrible headaches, external issues and projects, and some personal goals beyond the projection mapping project. Hopefully the headaches are past – I can manage the rest.

Here’s the basic model:

4statues

We can add lighting that can make it appear as if we’ve hung actual lights near the building:

spotlights

We can also play around (this is just a test and not final imagery):

lightjade

And add stuff:

1927

Here’s what it should look like at the campus. We intend to add some lighting around the central facade as well.

projectiontest

The Facade

The limestone part of the high school’s main entry has several nice 1920s Art Deco details and is sculptural in nature, with deep-set doors and windows and jutting pedestals for the four statues. I still need to add the letters for the statues. We will hopefully be able to temporarily cover the windows and doors so they won’t be so dark. We will also need to cover the lanterns so they will reflect the projections.

dsc00432

Ambition, Personality, Opportunity, and Preparation

When facing the building the four statues from left to right are Ambition (male), Personality (female), Opportunity (female), and Preparation (male).

I’ve been told that the four statues were “ordered from a catalog” and not unique to the building project. Their body styles are reminiscent of Michelangelo sculptures, with long muscular arms and Greek facial features. Preparation must have been the sculptor’s version of David – see his contrapposto stance, physique, lowered right arm (holding a scroll in this case), raised left arm holding a book instead of a sling, and left-facing gaze.

ch-interior_110 512px-27david27_by_michelangelo_jbu0001

Their dress is based on the ancient Greek chiton. The sculptural style is “wet drape,” where the cloth clings to the skin to reveal the figure’s body underneath. This is most obvious in Preparation, whose torso practically looks bare, and you can see it in Opportunity as well. I modeled these statues by starting with nudes so I could get the wet-drape look right.

I think later blog posts will go on another website dedicated to this project. Geeky stuff will stay on this blog though.

Geek Stuff (most of you will want to skip this)

I modeled the facade by building basic geometric shapes and aligning them to a photograph I took last summer. I actually got most of this model finished by last fall. In January I added the smaller details and lanterns.

The statues were very time consuming, and I knew they would be… I downloaded a few nude “base models” from Blendswap, which are designed as a starting place for creating a character. For the females, I used the body of one and the hands and head of another. After splicing them together I pushed, pulled, and extruded faces, edges, and vertices to make them match the sculpture. I also used sculpting tools to smooth and guide areas of the model. The models are considered low-poly, which makes them easy to animate and handle in the 3D software. When they are rendered they are smoothed using Pixar’s subdivision surface technology, which turns a blocky mess of polygons into flowing garments.

For the capes I essentially started with a line and extruded it and moved it to create the overlapping folds. For smaller details I just cut the larger polygonal faces into smaller ones that I could then push, pull, and sculpt into their final form.

Once a model seemed ready to go I aligned it with the main photo of the facade. I had closeups of the statues to do most of the work, but since those photos were taken from below, the proportions were not accurate, so aligning with the main photo was key to getting the overall size correct. Because of the proportion issues and a number of other things, I modeled them just by looking at my photos rather than trying to align them to photos in the 3D viewport, which is common for character design.

While modeling, the virtual statue stands in a T-pose. I used a T-pose because we will most likely apply some custom motion capture animation and our motion capture system (Perception Neuron) requires a T-pose to start. Another common starting point for a character model is an A-pose, which is more relaxed but not a good idea for our purposes.

After getting the proportions correct I added a skeleton to the model. The skeleton is based on the needs of the motion capture system. The model is bound to the skeleton, so whenever I move a bone the model will deform with it. I used the bones to pose the model to match the statues. I actually animated the movement so I could go back to the T-pose easily as well as test the model deformations as the bones moved. Some of the dress is not driven by the skeleton at the moment; that will come later via cloth simulations.

opportunityposing

I modeled the statues this way because I knew we would be animating them and they needed a structure that would support animation. A more accurate alternative to modeling by eye would have been to scan the actual sculptures. Scanning could be done via LIDAR, but would have been prohibitively expensive. Or, it can be done with lots of photographs from multiple angles via photogrammetry. Shooting the sculptures with a drone and extracting frames from the video would have been a way to get the images needed.

The upside to scanning would be a very accurate model, but there are downsides. One is that the scan would have to be retopologized, which can be time intensive, to make it animatable. Another is that the models would not have a backside and the arms would be stuck to the bodies so they would need hand modeling to create the back and make the arms free. I would have been up for these things had they been scanned last fall. Unfortunately they are 22 feet above the ground so logistically it is not a trivial issue to get to them.

From here it is a matter of lighting, creating cool surface materials, animating the statues, opening the doors, or whatever else we come up with. Even things that don’t directly change the facade, such as showing a photo, will be rendered against the virtual facade so the photo will appear to interact with the building.

Blender

screenshot

I used Blender to do all of this work. It is just a joy to use. Some things that came in handy (these aren’t necessarily unique to Blender BTW):

  • Used photos as a background in the camera viewport to help create a 3D environment similar in size to the actual building.
  • Changed one of my 3D panels into an image viewer so I could have a photo of a statue up at all times.
  • The Shift key – I use a Wacom Intuos 4 Medium when working with graphics software. It has a bad habit of moving during a click, or of missing the mark I tried to make because it was so small. When changing a parameter in Blender (practically no matter what it is), you can hold down the Shift key and it will increase the accuracy of the parameter by not letting it change drastically no matter how much you move the stylus. I can make big movements to make small changes. BTW, some graphics programs do have a similar function, just not all…
  • Matcaps – I hadn’t really used them before, but they make modeling organic forms much easier. They allow you to customize how the model is shaded in the viewport so you can see the curved surfaces more easily.
  • Proportional Editing – used when moving a vertex or a small group of vertices and wanting surrounding vertices to move with them, just not as much. Super helpful when making proportion changes or needing to move parts of the model around to accommodate the posed body. Especially useful is the “Connected” mode, which only moves vertices connected to the one you are moving rather than ones that are merely nearby. You can also change the falloff to control how the other non-selected vertices will change. BTW, this works on more than just vertices – I’m only using them as an example.
  • Subdivision Surfaces – Blender can show the subd effect while editing the model, either by showing the base model and the smoothing separately or by bending the base model’s edges along the surface of the smoothed model. This really helps me see how changes to the low-resolution model will change the smoothed one.
  • Solidify modifier – I made the capes a single polygon thick and used this modifier to give them dimensional thickness. When sending the models out to Jim and Jonathan, who use Cinema4D and Maya, I will “Apply” this effect to make the geometry permanent.
  • Cycles with two GPUs – it’s so fast! For test renderings and the images in this blog post, it’s amazing how quick Cycles can be – each image here took about a minute and a half to render. It’s also crazy easy to make objects into light sources. I do most of the work on my iMac and then switch over to my Linux computer for rendering.

Ergonomics – An Update

I’m entering my fourth year of using standing desks and this winter and spring will be the ultimate test. I’m on sabbatical leave and will be working on an intense animation project so I’ll be at my desk a lot.

During a regular semester my day is broken up with:

  • Some office time – at my standing desk
  • Meetings – sitting
  • Class – standing mostly, but fairly mobile – I walk around in the classroom
  • A daily walk (or more if I have meetings in another building)
  • At home in the evenings a mix of movement and sitting on the couch

During the summer (especially the last two), I stay away from my computer as much as possible. Instead I’m outside working or in my garage/workshop building something. Standing most of the day, but with a lot of movement.

So I’m not really at the desk 8 hours a day. For the next few months, however, I will be doing work on the computer most of the week’s working hours. There are a few key things to making the standing desk work:

  • Wear sensible shoes. In my last post I mentioned some running shoes I liked. Over the last year I’ve been wearing some Army-style boots that are super comfortable with thick soft soles and a gel insert.
  • Get a good anti-fatigue mat. I think those who try standing and then give it up quickly probably don’t take this part seriously. I’ve used kitchen-style mats and they suck: they sink within a few seconds and I feel like I’m just standing on the floor. A couple of years ago I switched to industrial mats and they are great. They don’t sink; instead they give you a springy feel. I’ve got two full-size mats linked together to go the length of my desk in my home studio, and I cut one in half with a utility knife for my office (gave the other half to my wife for her office).
  • Make sure your desk is the right height. This is where I’ve had a couple of problems:
    • When my office desk was too low I found myself putting pressure on my wrists, which over time caused me enough discomfort that I now do stretches and try my best not to bend my wrists much. For instance, I do pushups on my knuckles now because it hurts to do them with my palms flat on the floor.
    • My home studio desk is currently too low, but I keep my keyboard arranged so it is behind my drawing tablet so I am not putting pressure on my wrists. Instead I get a little lower back discomfort because I am slouching or compressing my upper body. The desk will get raised ½ inch this weekend and all will be well. I cut small squares of MDF and slide them under the feet.
    • Why is my desk the wrong height? When I changed shoes and mat I got taller, but my desk didn’t. If you get a desk or desktop device that is adjustable then you are good to go. My next desk design will be adjustable.

In the summer of 2015 I built a standing sewing table for my wife, who is a costume designer and expert seamstress. She likes it much better than sitting and hunching over the sewing machine (and cat stuff fits underneath).

standingsewingtable

I’ve seen a few desktop stand/sit devices, but I recently learned about an upcoming one from a company called Standable. Their device looks pretty cool. What I like:

  • Simple elegant design
  • Easy to adjust the height, with no motors or weird-looking mechanical parts, which can be an eyesore on desktop devices compared to whole desks.
  • Designed to separate the screen and keyboard of a notebook computer (something I do at work), which is key to working healthier with a notebook computer.
  • I’m assuming the price will be accessible due to its lack of motors and the like – and low-cost seems to be part of their mission.
  • Most of my friends and colleagues have notebook computers as their office computers so I see this as being something they could try without being intimidated by some of the industrial looking devices that are out there

My concerns:

  • In the looping video the keyboard shelf seems to wobble as he types. That’s one of my many pet peeves (I have too many). I’ve removed sliding keyboard trays and other cool-looking devices from desks because I thought they moved too much. Larger motorized devices are probably stiffer. Other people might dig it, since it’s almost like a suspension system and shock absorber.
  • Curious about the maximum user height at which the screen stays in the ergonomic viewing range. Not an issue for me since I’m under 6’ tall, but I’ve got friends who might be tall enough that this device isn’t for them. We’ll know when the Kickstarter campaign is underway.

Standable has a nice web page on work health tips and highlights a number of ways that you can use Kickstarter projects to be healthier at work – including, but not limited to, standing.

They also have a guide for getting healthier at work for both sitting and standing (scroll down on their main page).

Finally, some more info about the Standable project:

team-pic-5_fb

We know that the best way to stand is with your eyes straight ahead and your arms at a 90 degree angle to your keyboard. Our goal was to create a desk that accommodates people of all heights, shapes and sizes. Most affordable solutions have a single tray, causing T-Rex syndrome (arms too close to head) or worse, the Mummy syndrome – a big old pain in your neck from looking down at where your screen is. Those that do have a second tray don’t allow you to customize the distance between where you are typing and looking. For all the good that standing can do, we just couldn’t understand how no one took it across the finish line and made it easy to adjust based on where your arms fall.

Standable is the first desk that is designed to give you the most natural stance while you stand at work. The key is this – eyes straight ahead and elbows at a 90 degree angle. We created a 2 shelf system that does not require any expensive electronics or complicated reconfiguration to allow your eyes and arms to be in the right place. Once we figured out how to solve for the issue, our designer set out to make it as easy and beautiful as possible.

Other desks assume all bodies are made the same – Standable knows they are not!
Join us… www.thestandable.com

Radium Girls – Set and Projection Design

It’s been five years since I’ve designed a theatrical production with UCA Theatre. My last design was The Bacchae that was both a set and a projection design project. This time around it’s Radium Girls and again I designed the physical scenery and projected imagery. Radium Girls was directed by my colleague, Chris Fritzges.

About Radium Girls

From wikipedia – “The Radium Girls were female factory workers who contracted radiation poisoning from painting watch dials with self-luminous paint at the United States Radium factory in Orange, New Jersey, around 1917. The women, who had been told the paint was harmless, ingested deadly amounts of radium by licking their paintbrushes to give them a fine point; some also painted their fingernails and teeth with the glowing substance.

Five of the women challenged their employer in a case that established the right of individual workers who contract occupational diseases to sue their employers.”

The play, by D.W. Gregory, tells this story through one of the girls, Grace Fryer, and the president of the U.S. Radium Corporation, Arthur Roeder.

Design Process

The design team, made up of theatre faculty (myself included) and students, met several times to discuss the play, including what the story means and what our production goals were. One of the big scenic goals was to include projected imagery. The main reason for projections was that the play has many scenes in different locations and shouldn’t be staged with a lot of traditional scenery. The thought was that projections could change quickly and help inform the audience of where the different scenes were taking place. Another overall goal was to use scenery that was abstract and allowed for interesting staging, such as multiple platforms at different heights, rather than being realistic looking. Realism is best used for costumes and properties (props) – the things that are closest to the characters want some authenticity, while the playing space can be more abstract or symbolic.

Chris started the process of developing the design by discussing different themes he saw in the story. The following are a few of the larger themes:

  • The Corporation vs. the Worker
  • Masculine vs. Feminine
  • Science vs. Business
  • Fighting time
  • The media

Some visual themes/motifs included clocks, gears, and flowers.

Design Influences

The next step in the process was to do some research. The play’s time period was the 1920s and it recounts actual events so the team, including a student dramaturg (one who is dedicated to researching the play in detail and making his research available to the rest of the team), looked for pictures and articles about the girls, Marie Curie, the U.S. Radium Corporation, radium products and research, and general 1920s trends in clothing, art, and architecture.

I was ultimately most influenced by the work of Hugh Ferriss, the U.S. Radium plant, and timepieces of the era.

U.S. Radium Corporation plant and dial painters

Set Design

Sometimes a set design will just come to me and I quickly work on about three variations of an idea. Not for this play. Instead, I drew sketches of several different ideas and shared them with the design team. The gear and clock influences are a thread throughout the ideas, as are the factory windows, which are referenced in the play. What I was unsure of was the actual projection surfaces – how integrated should they be into the playing spaces? Also, should we project flat onto typical screens or consider other shapes for projection surfaces?

The sketches for the Radium Girls set design


After looking at sketches for a couple of weeks, we decided that we liked three levels of platforms and that they should be round (more feminine shape, clocks, gears, radium symbol). We also worked out the size of each platform. The projection surface ended up taking a little longer, but we finally worked out a projection mapping-oriented wall that had an industrial skyline silhouette at the top. The projection mapping aspect of it was that the screen was not just one plane stretching across the back of the platforms. Instead, it was broken into multiple planes at different angles. Doors through the projection surfaces were the last pieces to go in.

Radium Girls set design front view


Radium Girls set design side view


We made some last-minute changes to the heights of the platforms for time and cost savings, which ultimately made the set work better. You’ll notice that the doors are above the platforms in the renderings because I was trying to show the change in height as fast as I could… Also, since it had been a while since I had done a theatrical set, and I was preoccupied with the projected imagery, Shannon Moore, the theatre Technical Director, was instrumental in dealing with finishing touches like the steps and platforms on the upstage side of the set through the doors.

Lastly, I created a painter’s elevation for the platforms. Two platforms were clock faces and the third was a watch/industrial gear.

Painter's elevation


The Set


pre-show-photo

Pre-show and intermission look

Projections

After the set design was done we moved on to the projection design. I primarily worked with Chris rather than the whole design team. The cast also had some input on projection ideas. Chris and I met three times to go through possible imagery for each scene. In the early meetings I discussed imagery ideas that were documentary-like: imagery based on period photos, actual photos of the characters portrayed, newspaper clippings, etc. As we got into discussing the imagery and getting ideas from the cast, I felt the documentary idea wasn’t working with the production style and ideas. The final overall design concept was to experience each location through symbolic imagery and/or closeups of objects that would be in that particular location.

In the scenes set in characters’ homes I tried to focus on fireplace mantels because I wanted to feature some style of clock. I included enough clocks that Chris mapped out the time that should be on each clock face, starting at 1:00 and going to 11:45.

Doors

The doors didn’t quite work with the concept of closeups and symbolism, so I had to come up with a way to change the apparent scale of the spaces depicted in the imagery. During an early rehearsal I saw the problem and came up with a solution almost immediately: use as much of the screen as possible for the closeup objects, such as a fireplace mantel, and then change the scale around the door to make it more realistic. I used the scale of the objects and the wallpaper pattern so that if viewers really bent their heads around what I created, they could rationalize the different-sized objects. I imagined what a door across a room would look like if I were standing close to the fireplace: the fireplace objects would be large in my view and the door small due to its distance from me.

There were a few places where I tweaked this concept. In the exterior porch of the Roeder home I chose to keep the door in scale, but the house’s siding and eave would be large and out of scale. In the health department I created oversized filing cabinets that dwarf the door. In Grace’s home both doors are used, so I couldn’t use the same technique; instead I made the props, like the hanging lights and the mantel clock, oversized.

Technical Stuff

Figure 53’s Qlab was used to play back the imagery on an iMac. A VGA signal was sent to two 4000-lumen projectors at 1920×1080 pixel dimensions. Both projectors got the same image, overlapping each other to increase the overall brightness. Qlab was used to warp the image to counteract the distortion from the angled screens (projection mapping!).

Blender was used for almost all of the imagery. I used as many pre-modeled objects as possible to save time. There are some recurring scenes with two newspaper reporters, and most of those images were created in Photoshop. I used two computers concurrently to stay productive. My main computer is an iMac and I used it for the modeling, for setting up lighting and materials in Blender, and for the Photoshop work. I then moved over to an older Linux computer I have with two Nvidia graphics cards. Blender’s Cycles renderer can be accelerated using Nvidia cards (AMD cards are almost ready to accelerate too, BTW), so I finalized the shading and lighting and did the final renders with it.

Oh yeah, I also made some tables for the show

Radium Girls Tables

Radium Girls Tables

Final Thoughts

The show’s overall production quality was amazing. The set, projections, costumes (designed by Shauna Meador), lighting, props, sound, and performances went together so well. We often talk about a unified production, but sometimes there is one element or another that just doesn’t seem to fit. Not in this case. The show looked really good and was well directed and performed. I can be very critical, especially of my own work, so I am surprised at how good I feel about the work.

There were problems of course.

  1. I started making the images way too late. I literally did 85% of the images in the last weekend before the show opened (it was UCA’s fall break, so that last weekend was several days…).
    1. There were 50 images – the most I’ve made for a single show
  2. Because I was so late I didn’t give Chris very many opportunities for feedback. I think he was happy with my work overall, but we should have been able to work together more.
  3. I wanted some of the imagery to be animated, such as spinning newspapers, smoke or dust in the air, subtle movements of objects, etc. There were no animations.
  4. We either started our whole process a little late or took too long to design the set – maybe both. Construction on the set should have started at least a week earlier than it did.
  5. The way I set up the projectors was lame. They sat on an angled board on the theater’s second catwalk, and because they were not locked down by any kind of rig, they had to be touched every night to make sure they were still aligned to each other.
  6. The projectors were not perfectly aligned. Cheap projectors don’t have the adjustments needed to finely align the images of multiple projectors, so I got them as close as I could. The image looked out of focus toward the bottom left (as seen by the audience) and had an overall soft look due to the slight mismatch.
    1. A workaround would have been to send an individual signal to each projector and use QLab to do the final alignment by giving each projector a custom warp. Instead, I sent a signal to one projector and used its loop-thru to feed the other. Sending two signals would also have required a different computer.
  7. The projections needed to be brighter. Dr. Greg Blakey, the lighting designer, made a lot of last-minute changes to the lights to keep as much illumination off the screen as possible. The only way we could have gone brighter would have been renting a large-venue projector (10,000 lumens or more), and that would have blown the budget, unfortunately.

Some of the projections:

The images below are a mix of photos and the actual projection images. The photos are untouched JPEGs from the camera; when I have more time I’ll work on the raw files. The screen in these photos looks a little darker than it did live.


Shopsmith ER10 to Drill Press

I restored this 1950s-era Shopsmith ER10 about 12 years ago and have since used it as a wood lathe, disk sander, drill press, and horizontal boring machine (a fancy way of saying a drill press on its side).

ER10Whole

Shopsmith ER10

About four years ago I purchased an early 1960s Shopsmith Mark V for $50, and since then the two have been sitting side-by-side in my garage. I bought the Mark V because it has a stronger motor, better speed control, and a rolling base. This one happens to be a “Goldie,” a model sold for only three years.

Goldie - needs restoration

Goldie – needs restoration

Two Shopsmiths side-by-side is dumb. The idea was to make the ER10 into a drill press and possibly a milling machine, but year after year it wasn’t happening – until now. I was inspired by this find on Pinterest,

but felt a base would be better than trying to attach it to a wall. The new base is welded 1-1/2″ angle iron with a 1-1/2″ pine top (finished with linseed oil and wax).

Now it’s time to get that Goldie running. It really just needs cleaning and repainting.

CuttingMiters

Cutting 45deg angles (miters)

CutPieces

Everything’s cut

LayoutForWelding

Layout for welding

Base

Finished base

PreppingER10

Prepping for permanent mount

BaseUpperSupport

Base and upper support

Mounted

Everything’s on and bolted down

AllTogether

All together

3D Printing: an update

It’s past time for a 3D printing update so here it goes:

Pipeline

Though I’ve used a few different apps to model printable objects and control the printer, my go-to pipeline has been: model to scale in Blender > export STL > use Repetier Host/Slic3r for layout, slicing, and printer control. I supported a Kickstarter campaign for AstroPrint and have used it once since it shipped, but at the time I did not like the lack of slicing control. With Slic3r I can control whether the object should be hollow or filled (and how much infill), as well as how many layers I want on the walls, bottom, and top of the printed object, whereas AstroPrint had quality presets and that’s it – it has probably changed by now, but I haven’t looked yet. If I change anything in my pipeline, I might lay out and slice the model in Repetier Host/Slic3r and then upload the G-code to AstroPrint to control the printer. BTW, AstroPrint runs on the Raspberry Pi and can provide updates and some control via a web browser and an iPad/iPhone app, so there is good reason to consider it.

Blender is great for modeling and even does a good job modeling to scale. The biggest issue I ran across was getting Blender’s scale to match real scale. It hadn’t been a problem in the past with Sketchup models in Blender (1 foot in Sketchup was 1 foot in Blender), but exported STLs came out 1000 times smaller than they were supposed to be. I learned that I needed to change the world scale to 0.001 instead of the default 1.0, and all was well. Whenever I need to model something to print, I open a template blend file I saved, and it is ready to go for printable sizes. Until I learned about changing the world scale, I was scaling the models up 1000x in Repetier Host.
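The 1000x factor is a unit mismatch: STL files carry no units, slicers read the numbers as millimeters, and my exports were coming through at 1/1000 scale. Before I fixed the world scale, I was correcting this by hand in Repetier Host. A little sanity check like this captures that workaround (plain Python; the function name and the 1 mm threshold are mine, not part of any of these tools):

```python
def fix_stl_scale(dims_mm, expected_min_mm=1.0, factor=1000.0):
    """Rescale suspiciously tiny STL dimensions.

    Slicers read STL coordinates as millimeters. With the wrong world
    scale my exports were 1000x too small, so a 40 mm part arrived as
    0.04 mm. If every dimension is below `expected_min_mm`, assume the
    unit mismatch and scale up by `factor` (the same 1000x I used to
    apply by hand in Repetier Host).
    """
    if all(d < expected_min_mm for d in dims_mm):
        return [d * factor for d in dims_mm]
    return list(dims_mm)
```

Changing the world scale in the template .blend file makes this unnecessary, since the exported numbers then land in millimeters directly.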

So far I’ve only printed with PLA plastic. I’ve got at least one project I need to do ASAP that I think will work better with ABS.

Lysistrata

The first big project for the MakerGear M2 was phallus costume pieces for UCA Theatre’s production of Lysistrata. I helped design and printed three different looks for the phalluses. They were modeled in Blender, exported as STL files, and then printed. They were larger than the M2’s print volume so I designed them in two pieces with flanges that were big enough to hold some epoxy to adhere them. Theatre students then filled them with polyurethane foam, sanded them, and painted them. We then adhered metal strips and magnets to them so they would stay on the actors, but the actors could take them off on-stage.

lysistrata1

UCA Theatre’s Lysistrata

I had only one issue printing the phalluses. On two print runs, a top section came loose from the print bed and did not finish printing. Luckily it was only the last 1/8 to 1/4 inch, so we just filled the gap. I believe they lost adhesion because the extruder (“print head”) was moving rapidly between pieces, shaking them just enough that the extruder knocked a piece over instead of lining up to extrude the next layer of plastic. I lowered the extruder’s travel speed when not printing and did not have any more issues.
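I made that change in the slicer’s travel-speed setting, but the same cap can be applied to finished G-code. This is only an illustration of the idea, assuming the slicer emits G0 for non-printing travel moves; the function name and the 6000 mm/min cap are my own:

```python
import re

def cap_travel_speed(gcode_lines, max_feed=6000):
    """Cap the feedrate (F, in mm/min) on G0 travel moves.

    Rapid travel between separate pieces was shaking parts loose from
    the bed; slowing non-printing moves fixed it. G0 moves are travel
    (no extrusion), so only their F values are clamped; printing G1
    moves are left alone.
    """
    out = []
    for line in gcode_lines:
        if line.startswith("G0 "):
            line = re.sub(
                r"F(\d+(?:\.\d+)?)",
                lambda m: "F%g" % min(float(m.group(1)), max_feed),
                line,
            )
        out.append(line)
    return out
```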

Printing old man phallus

Printing old man phallus

Filled_Phalluses

Printed and polyurethane filled

One character had a unique phallus that was larger than the other men’s and was lighted from within. It was printed in four sections so I would have access to the interior. The idea was that the phallus could change color and/or intensity during his scene. Not having much time or experience with LEDs, we purchased a strip of remote-controlled LEDs. The expectation was that it would not need any programming or special wiring: it was already battery powered and remotely controlled, with control over color, blinking, intensity, etc.

I cut the LED strip into sections that would fit in the phallus and soldered them together with short wires. Once soldered, I taped them together, fit them into the phallus halves, and then epoxied the whole assembly together. “In the lab” it worked well. On stage, however, it wasn’t going to work: the remote control was infrared, which requires line-of-sight to the LEDs, and it was impossible to control them from off-stage, so we scrapped the idea. I learned a lot about LEDs, though, and am glad I went through the process even though they were not used on-stage.

Printing LED phallus

Printing LED phallus

LED_Phallus_Halves

Halves epoxied

Soldered LED strips

Soldered LED strips

All wired up

All wired up

Into the phallus

Into the phallus

Fully assembled

Fully assembled

I also designed and printed a few badges for the play’s police characters.

Printing_Badge

Printed_Badge

Ghost of Christmas Past

A few weeks later I had another 3D-printing-and-LED challenge. I made a necklace for the Ghost of Christmas Past in A Christmas Carol, produced and directed by Jim Harris for the University of Arkansas Community College in Morrilton (UACCM). Jim and I decided to model the look on a Swarovski Christmas ornament and install an array of LEDs in the necklace; the actor would need to turn the LEDs on at a certain moment. I modeled the necklace in Blender in two pieces (front and back). Modeling and printing were easy at this point. I did print a low-resolution version first to make sure the LEDs would fit. Otherwise, no printing issues.

Blender model

Blender model

Printing the star

Printing the star

Having learned my LED lessons from the phalluses, I grabbed some programmable LEDs and a wearable controller so I could control the whole assembly and its programming. The LEDs were NeoPixels and the controller was a GEMMA, both from Adafruit; I also got a power supply and switch from them. The first miracle was getting the LEDs wired. After reading a couple of tutorials on Adafruit’s site, I hooked up the LEDs with some solid-core wire, which was great since it can be bent and shaped as needed and keeps its shape. The next miracle was programming the GEMMA. It is a simple form of Arduino, and Adafruit has tutorials to help get one going. I finally got it to light up in a pattern and then lightly pulse randomly. There is one glitch that I did not have time to fix, but I doubt anyone noticed…
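The “lightly pulse randomly” look boils down to a slow wave around a base brightness, plus a little random jitter so it never looks mechanical. The actual GEMMA sketch is Arduino C++ using the Adafruit NeoPixel library; this is only the pulse logic sketched in plain Python, with all names and numbers my own:

```python
import math
import random

def pulse_frame(t, base=0.6, depth=0.25, period=2.0, jitter=0.05, rng=random):
    """One brightness sample (0.0-1.0) for a gentle random pulse.

    A slow sine wave around `base` brightness, with `jitter` of random
    noise added per frame. The result is clamped to the 0-1 range,
    which would then be scaled to the NeoPixels' 0-255 brightness.
    """
    wave = base + depth * math.sin(2 * math.pi * t / period)
    noise = rng.uniform(-jitter, jitter)
    return max(0.0, min(1.0, wave + noise))
```

On the GEMMA the same idea runs in the main loop, writing each frame’s value to every pixel before a short delay.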

Wiring

Wiring

Wired up

Wired up

Testing

Testing

Coming together

Coming together

Installed

Installed

Switch

Switch

Lighted

Lighted

I’m proud of the necklace. My original plan was to disassemble the wiring and re-use the LEDs for another project; however, I love the look of the wiring, so I’m working on using it as-is.

3D Printed LED Necklace from Scott Meador on Vimeo.