Photographing Lightning during The Day or Night with a DSLR

Capturing lightning using a neutral density filter and long exposure

As many of you know, I’m an avid time lapse videographer, and the original purpose of our Flicker Free filter was time lapse. I needed a way to deflicker all those night to day and day to night time lapses. I also love shooting long exposure photos.

As it turns out, this was a pretty good experience to have when it came to capturing a VERY rare lightning storm that came through San Francisco late last year.

Living in San Francisco, you’re lucky if you see more than 3 or 4 lightning bolts a year. Very different from the lightning storms I saw in Florida when I lived there for a year. However, we were treated to a definitely Florida-esque lightning storm last September. Something like 800 lightning strikes over a few hours. It was a real treat and gave me a chance to try and capture lightning! (with a camera)

The easiest way to capture lightning is to just flip your phone’s camera into video mode and point it in the direction you hope the lightning will be. Get the video and then pull out a good frame. This works… but video frames are usually heavily compressed and much lower resolution than a photo.

I wanted to use my 30MP Canon 5D Mark IV to get photos, not the iPhone’s mediocre video camera.

Problems, Problems, Problems

To get the 5D to capture lightning, I needed at the very least: 1) a tripod and 2) an intervalometer.

Lightning happens fast. Like, speed of light fast. Until you try and take a picture of it, you don’t realize exactly how fast. If you’re shooting video (30fps), the bolt will happen over 2, maybe 3 frames. If you’ve got a fancy 4K (or 8K!) camera that will shoot 60 or 120fps, that’s not a bad place to start.

However, if you’re trying to take advantage of your 5D’s 6720 × 4480 sensor… you’re not going to get the shot handholding it and manually pressing the shutter. Not going to happen. Cloudy with a chance of boring-ass photos.

So set the camera up on a tripod and plug in your intervalometer. You can use the camera’s built-in one, but an external intervalometer gives you more options. You want it firing as fast as possible, but that usually means only once every second. During the day, that’s not going to work.

Lightning And Daylight

The storm started probably about an hour before sunset. It was cloudy, but there was still a fair amount of light.

At first I thought, “once every second should be good enough”. I was wrong. Basically, the lightning had to happen at the exact moment the camera took the picture. Possible, but the odds are against you getting the shot.

As mentioned, I like shooting long exposures. Sometimes at night but often during the day. To achieve this, I have several neutral density filters which I stack on top of each other. They worked great for this. I stacked a couple .9 ND filters on the lens, bringing it down 6 stops. This was enough to let me have a 1/2 sec. shutter speed.

1/2 sec. shutter speed and 1 sec. intervals… I’ve now got a 50/50 chance of getting the shot… assuming the camera is pointed in the direction of the lightning. Luckily it was striking so often that I could make a good guess as to the area it was going to be in. As you can see from the above shot, I got some great shots out of it.
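For the curious, the arithmetic behind those numbers is simple. Here’s a minimal sketch (the 1/128 sec. base exposure is a hypothetical daylight value for illustration, not from my notes):

```python
# Each 0.9 ND filter cuts 3 stops of light; stacking two gives 6 stops.
nd_stops = 3 + 3

# Each stop doubles the exposure time, so 6 stops = 2**6 = 64x longer.
# A hypothetical 1/128 sec. daylight exposure becomes 1/2 sec.
base_shutter = 1 / 128                   # seconds, without filters
long_shutter = base_shutter * 2 ** nd_stops
print(long_shutter)                      # 0.5

# With the intervalometer firing once per second, the chance a random
# strike lands while the shutter is open is shutter / interval.
interval = 1.0                           # seconds between frames
print(f"{long_shutter / interval:.0%}")  # 50%
```

Stack more ND (or slow the shutter further) and the odds only get better, as long as the interval stays the same.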

Night Lightning

Photographing lightning at night with a Canon 5D

To the naked eye, it was basically night. So with a 2 second exposure and a 2 second interval… as long as the lightning happened where the camera was pointed, I was good to go. (it wasn’t quite night, so with the long exposure you got the last bits of light from sunset) I did not need the neutral density filters as it was pretty dark.

By this point the storm had moved. The lightning was less consistent and a bit further away. So I had to zoom in a bit, reducing the odds of getting the shot. But luck was still with me and I got a few good shots in this direction as well.

I love trying to capture stuff you can’t really see with the naked eye, whether it’s using time lapse to see how clouds move or long exposure to see water flow patterns. Experimenting with capturing lightning was a blast. Just wish we saw more of it here in SF!

So hopefully this gave you some ideas about how to capture lightning, or anything else that moves fast, next time you have a chance!

Artificial Intelligence is The New VR

A couple of things stood out to me at NAB.

1) Practically every company exhibiting was talking about A.I.-something.

2) VR seemed to have disappeared from vendor booths.

The last couple years at NAB, VR was everywhere. The Dell booth had a VR simulator, Intel had a VR simulator, booths had Oculuses galore and you could walk away with an armful of cardboard glasses… this year, not so much. Was it there? Sure, but it was hardly to be seen in booths. It felt like the year 3D died. There was a pavilion, there were sessions, but nobody on the show floor was making a big deal about it.

In contrast, it seemed like every vendor was trying to attach A.I. to their name, whether they had an A.I. product or not. Not to mention, Google, Amazon, Microsoft, IBM, Speechmatics and every other big vendor of A.I. cloud services having large booths touting how their A.I. was going to change video production forever.

I’ve talked before about the limitations of A.I. and I think a lot of what was talked about at NAB really over-promised what A.I. can do. We spent most of the six months after releasing Transcriptive 1.0 developing non-A.I. features to help make the A.I. portion of the product more useful. The release we’re announcing today and the next release coming later this month will focus on getting around A.I. transcripts completely by importing human transcripts.

There’s a lot of value in A.I. It’s an important part of Transcriptive and for a lot of use cases it’s awesome. There are just also a lot of limitations. It’s pretty common that you run into the A.I. equivalent of the Uncanny Valley (a CG character that looks *almost* human but ends up looking unnatural and creepy), where A.I. gets you 95% of the way there but it’s more work than it’s worth to get the final 5%. It’s better to just not use it.

You just have to understand when that 95% makes your life dramatically easier and when it’s like running into a brick wall. Part of my goal, both as a product designer and just talking about it, is to help folks understand where that line in the A.I. sand is.

I also don’t buy into this idea that A.I. is on an exponential curve and it’s just going to get endlessly better, obeying Moore’s law like the speed of processors.

When we first launched Transcriptive, we felt it would replace transcriptionists. We’ve been disabused of that notion. ;-) The reality is that A.I. is making transcriptionists more efficient. Just as we’ve found Transcriptive to be making video editors more efficient. We had a lot of folks coming up to us at NAB this year telling us exactly that. (It was really nice to hear. :-)

However, much of the effectiveness of Transcriptive comes more from the tools that we’ve built around the A.I. portion of the product. Those tools can work with transcripts and metadata regardless of whether they’re A.I. or human generated. So while we’re going to continue to improve what you can do with A.I., we’re also supporting other workflows.

Over the next couple months you’re going to see a lot of announcements about Transcriptive. Our goal is to leverage the parts of A.I. that really work for video production by building tools and features that amplify those strengths, like PowerSearch, our new panel for searching all the metadata in your Premiere project, and to build bridges to other technology that works better in other areas, such as importing human-created transcripts.

Should be a fun couple months, stay tuned! btw… if you’re interested in joining the PowerSearch beta, just email us at cs@nulldigitalanarchy.com.

Addendum: Just to be clear, in one way A.I. is definitely NOT VR. It’s actually useful. A.I. has a lot of potential to really change video production, it’s just a bit over-hyped right now. We, like some other companies, are trying to find the best way to incorporate it into our products because once that is figured out, it’s likely to make editors much more efficient and eliminate some tasks that are total drudgery. OTOH, VR is a parlor trick that, other than some very niche uses, is going to go the way of 3D TV and won’t change anything.

Jim Tierney
Chief Executive Anarchist
Digital Anarchy

Just Say No to A.I. Chatbots

For all the developments in artificial intelligence, one of the consistently worst uses of it is chatbots. Those little ‘Chat With Us’ sidebars on many websites. Since we’re doing a lot with artificial intelligence (A.I.) in Transcriptive and in other areas, I’ve gotten very familiar with how it works and what the limitations are. It starts to be easy to spot where it’s being used, especially when it’s used badly.

So A.I. chatbots, which really don’t work well, have become a bit of a pet peeve of mine. If you’re thinking about using them for your website, you owe it to yourself to click around the web and see how often ‘chatting’ gets you a usable answer. It’s usually just frustrating. You go a few rounds with a cheery chatbot before getting to what you were going to do in the first place… send a message that will be replied to by a human. Total waste of time and it doesn’t answer the questions.

Artificial intelligence isn’t great for chatbots

Do you trust cheery, know-nothing chatbots with your customers?

The main problem is that chatbots don’t know when to quit. I get it that some businesses receive the same question over and over… where are you located? what are your hours? Ok, fine, have a chatbot act as an FAQ. But the chatbot needs to quickly hand off the conversation to a real person if the questions go beyond what you could have in an FAQ. And frankly, an FAQ would be better than trying to fake out people with your A.I. chatbot. (honesty and authenticity matter, even on the web)

A.I. is just not great at reading comprehension. It can usually get the gist of things, which I think is useful for analytics and business intelligence. But this doesn’t allow it to respond with any degree of accuracy or intelligence. For responding to customer queries it produces answers that are sort of close… but mostly unusable. So, the result is frustrated customers.

Take a recent experience with Audi. I’m looking at buying a new car and am interested in one of their SUVs. I went onto an Audi dealer site to inquire about a used one they had. I wanted to know 1) was it actually in stock and 2) how much of the original warranty was left since it was a 2017? There was a button to send a message which I was originally going to use but decided to try the chat button that was bouncing up and down getting my attention.

So, I asked those questions in the chat. If it had been a real person, they definitely could have answered #1 and probably #2, even if they were just an assistant. But no, I ended up in the same place I would’ve been if I’d just clicked ‘send a message’ in the first place. But first, I had to get through a bunch of generic answers that didn’t answer any of my questions and just dragged me around in circles. This is not a good way to deal with customers if you’re trying to sell them a $40,000 car.

And don’t get me started on Amazon’s chatbots. (and emailbots for that matter)

It’s also funny to notice how the chatbots try and make you think they’re human, with misspelled words and faux emotions. I’ve had a chatbot admonish me with ‘I’m a real person…’ when I called it a chatbot. It then followed that with another generic answer that didn’t address my question. The Pinocchio chatbot… You’re not a real boy, not a real person and you don’t get to pass Go and collect $200. (The real salesperson I eventually talked to confirmed it was a chatbot.)

I also had one threaten to end the chat if I didn’t watch my language, which was not aimed at the chatbot. I just said, “I just want this to f’ing work”. A little generic frustration. However, after it told me to watch my language, I went from frustrated to kind of pissed. So much for artificial intelligence having emotional intelligence. Getting faux-insulted over something almost any real human would recognize as low-grade frustration is not going to make customers happier.

I think A.I. has some amazing uses. Transcriptive makes great use of A.I., but it also has a LOT of shortcomings. All of those shortcomings are glaringly apparent when you look at chatbots. There are, of course, many companies trying to create conversational A.I., but so far the results have been pretty poor.

Based on what I’ve seen developing products with A.I., I think it’s likely it’ll be quite a while before conversational A.I. is a good experience on a regular basis. You should think very hard about entrusting your customers to it. A web form or FAQ is going to be better than a frustrating experience with a ‘sales person’.

Not sure what this has to do with video editing. Perhaps just another example of why A.I. is going to have a hard time editing anything that requires comprehending the content. Furthering my belief that A.I. isn’t going to replace most video editors any time soon.

Artificial Intelligence vs. Video Editors

With Transcriptive, our new tool for doing automated transcriptions, we’ve dived into the world of A.I. headfirst. So I’m pretty familiar with where the state of the industry is right now. We’ve been neck deep in it for the last year.

A.I. is definitely changing how editors get transcripts and search video for content. Transcriptive demonstrates that pretty clearly with text. Searching via object recognition is also already happening. But what about actual video editing?

One of the problems A.I. has is finishing. Going the last 10%, if you will. For example, speech-to-text engines, at best, have an accuracy rate of about 95%. This is about on par with the average human transcriptionist. For general purpose recordings, human transcriptionists SHOULD be worried.

But for video editing, there are some differences, which are good news. First, and most importantly, errors tend to be cumulative. So if a computer is going to edit a video, at the very least, it needs to do the transcription and it needs to recognize the imagery. (we’ll ignore other considerations like style, emotion and story for the moment) Speech recognition is at best 95% accurate, and object recognition is worse. The more layers of A.I. you have, the more those errors will usually multiply (though in some cases there might be improvement). While it’s possible automation will be able to produce a decent rough cut, these errors make it difficult to see automation replacing most of the types of videos that pro editors are typically employed for.
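As a rough illustration of how those errors compound (the object recognition number here is an assumption for the example; the only figure above is the 95% for speech):

```python
speech_accuracy = 0.95   # "at best 95%" for speech-to-text
object_accuracy = 0.85   # hypothetical; only stated above to be worse than speech

# If an edit decision needs both layers to be right, and the errors are
# independent, the accuracies multiply rather than average:
combined = speech_accuracy * object_accuracy
print(f"{combined:.0%}")  # 81%
```

Stack a third or fourth layer on top (shot selection, emotion detection) and the combined accuracy keeps dropping.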

Secondly, if the videos are being done for humans, frequently the humans don’t know what they want. Or at least they’re not going to be able to communicate it in such a way that a computer will understand and be able to make changes. If you’ve used Alexa or Echo, you can see how well A.I. understands humans. In lots of situations, especially literal ones (find me the best restaurant), it works fine; in lots of other situations, not so much.

Many times as an editor, the direction you get from clients is subtle or you have to read between the lines and figure out what they want. It’s going to be difficult to get A.I.s to take the way humans usually describe what they want, figure out what they actually want and make those changes.

Third… then you get into the whole issue of emotion and storytelling, which I don’t think A.I. will do well anytime soon. The Economist recently had an amusing article where it let an A.I. write the article. The result is here. Very good at mimicking the style of the Economist but when it comes to putting together a coherent narrative… ouch.

It’s Not All Good News

There are already phone apps that do basic automatic editing. These are more for consumers that want something quick and dirty. For most of the type of stuff professional editors get paid for, it’s unlikely what I’ve seen from the apps will replace humans any time soon. Although, I can see how the tech could be used to create rough cuts and the like.

Also, for some types of videos, wedding or music videos perhaps, you can make a pretty solid case that A.I. will be able to put something together soon that looks reasonably professional.

You need training material for neural networks to learn how to edit videos. Thanks to YouTube, Vimeo and the like, there is an abundance of training material. Do a search for ‘wedding video’ on YouTube. You get 52,000,000 results. 2.3 million people get married in the US every year. Most of the videos from those weddings are online. I don’t think finding a few hundred thousand of those that were done by a professional will be difficult. It’s probably trivial actually.

Same with music videos. There IS enough training material for the A.I.s to learn how to do generic editing for many types of videos.

For people that want to pay $49.95 to get their wedding video edited, that option will be there. Probably within a couple years. Have your guests shoot video, upload it and you’re off and running. You’ll get what you pay for, but for some people it’ll be acceptable. Remember, A.I. is very good at mimicking. So the end result will be a very cookie cutter wedding video. However, since many wedding videos are pretty cookie cutter anyways… at the low end of the market, an A.I. edited video may be all ‘Bridezilla on A Budget’ needs. And besides, who watches these things anyways?

Let The A.I. Do The Grunt Work, Not The Editing

The losers in the short term may be assistant editors. Many of the tasks A.I. is good for… transcribing, searching for footage, etc.… are now typically given to assistants. However, it may simply change the types of tasks assistant editors are given. There’s a LOT of metadata that needs to be entered and wrangled.

While A.I. is already showing up in many aspects of video production, it feels like having it actually do the editing is quite a ways off. I can see creating A.I. tools that help with editing: rough cut creation, recommending color corrections or B-roll selection, suggesting changes to timing, etc. But there’ll still need to be a person doing the edit.

 

Speeding Up De-flickering of Time Lapse Sequences in Premiere

Time lapse is always challenging… you’ve got a high-resolution image sequence that can seriously tax your system. Add Flicker Free on top of that… where we’re analyzing up to 21 of those high-resolution images… and you can really slow a system down. So I’m going to go over a few tips for speeding things up in Premiere or other video editors.

First off, turn off Render Maximum Depth and Maximum Quality. Maximum Depth is not going to improve the render quality unless your image sequence is HDR and the format you’re saving it to supports 32-bit images. If it’s just a normal RAW or JPEG sequence, it won’t make much of a difference. Render Maximum Quality may make a bit of difference, but it will likely be lost in whatever compression you use. Do a test or two to see if you can tell the difference (it does improve scaling) but I rarely can.

RAW: If at all possible you should shoot your time lapses in RAW. There are some serious benefits which I go over in detail in this video: Shooting RAW for Time Lapse. The main benefit is that Adobe Camera RAW automatically removes dead pixels. It’s a big f’ing deal and it’s awesome. HOWEVER… once you’ve processed them in Adobe Camera RAW, you should convert the image sequence to a movie or JPEG sequence (using very little compression). It will make processing the time lapse sequence (color correction, effects, deflickering, etc.) much, much faster. RAW is awesome for the first pass, after that it’ll just bog your system down.

Nest, Pre-comp, Compound… whatever your video editing app calls it, use it. Don’t apply Flicker Free or other de-flickering software to the original, super-high resolution image sequence. Apply it to whatever your final render size is… HD, 4K, etc.

Why? Say you have a 6000×4000 image sequence and you need to deliver an HD clip. If you apply effects to the 6000×4000 sequence, Premiere will have to process TWELVE times the amount of pixels it would have to process if you applied it to HD resolution footage. 24 million pixels vs. 2 million pixels. This can result in a HUGE speed difference when it comes time to render.
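A quick check of that pixel math:

```python
# Pixels per frame at the original resolution vs. HD delivery.
hi_res_pixels = 6000 * 4000   # 24,000,000 pixels per frame
hd_pixels = 1920 * 1080       # 2,073,600 pixels per frame

# Roughly the "twelve times" figure above.
print(round(hi_res_pixels / hd_pixels, 1))   # 11.6
```

And that multiplier applies to every frame of the sequence, for every effect applied at full resolution.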

How do you Nest?

This is Premiere-centric, but the concept applies to After Effects (pre-compose) or FCP (compound) as well. (The rest of this blog post will be explaining how to Nest. If you already understand everything I’ve said, you’re good to go!)

First, take your original image sequence (for example, 6000×4000 pixels) and put it into an HD sequence. Scale the original footage down to fit the HD sequence.

Hi-Res images inside an HD sequence

The reason for this is that we want to control how Premiere applies Flicker Free. If we apply it to the 6000×4000 images, Premiere will apply FF and then scale the image sequence. That’s the order of operations. It doesn’t matter if Scale is set to 2%. Flicker Free (and any effect) will be applied to the full 6000×4000 image.

So… we put the big, original images into an HD sequence and do any transformations (scaling, adjusting the position and rotating) here. This usually includes stabilization… although if you’re using Warp Stabilizer you can make a case for doing that to the HD sequence. That’s beyond the scope of this tutorial, but here’s a great tutorial on Warp Stabilizer and Time Lapse Sequences.

Next, we take our HD time lapse sequence and put that inside a different HD sequence. You can do this manually or use the Nest command.

Apply Flicker Free to the HD sequence, not the 6000x4000 images

Now we apply Flicker Free to our HD time lapse sequence. That way FF will only have to process the 1920×1080 frames. The original 6000×4000 images are hidden in the HD sequence. To Flicker Free it just looks like HD footage.

Voila! Faster rendering times!

So, to recap:

  • Turn off Render Maximum Depth
  • Shoot RAW, but apply Flicker Free to a JPEG sequence/Movie
  • Apply Flicker Free to the final output resolution, not the original resolution

Those should all help your rendering times. Flicker Free still takes some time to render, none of the above will make it real time. However, it should speed things up and make the render times more manageable if you’re finding them to be really excessive.

Flicker Free is available for Premiere Pro, After Effects, Final Cut Pro, Avid, Resolve, and Assimilate Scratch. It costs $149. You can download a free trial of Flicker Free here.

Getting transcripts for Premiere Multicam Sequences

Using Transcriptive with multicam sequences is not a smooth process and doesn’t really work. It’s something we’re working on a solution for, but it’s tricky due to Premiere’s limitations.

However, while we sort that out, here’s a workaround that is pretty easy to implement. Here are the steps:

1- Take the clip with the best audio and drop it into its own sequence.
Using A.I. to transcribe Premiere Multicam Sequences
2- Transcribe that sequence with Transcriptive.
3- Now replace that clip with the multicam clip.
Transcribing multicam in Adobe premiere pro

4- Voila! You have a multicam sequence with a transcript. Edit the transcript and clip as you normally would.

This is not a permanent solution and we hope to make it much more automatic to deal with Premiere’s multicam clips. In the meantime, this technique will let you get transcripts for multicam clips.

Thanks to Todd Drezner at Cohn Creative for suggesting this workaround.

How Doc Filmmakers Are using A.I. to Create Captions and Search Footage in Premiere Pro

Artificial Intelligence (A.I.) and machine learning are changing how video editors deal with some common problems: 1) How do you get accurate transcriptions for captions or subtitles? And 2) how do you find something in hours of footage if you don’t know exactly where it is?

Getting out of the Transcription Dungeon

Kelley Slagle, director, producer and editor for Cavegirl Productions, has been working on Eye of the Beholder, a documentary on the artists who created the illustrations for the Dungeons & Dragons game. With over 40 hours of interview footage to comb through, searching it all has been made much easier by Transcriptive, a new A.I. plugin for Adobe Premiere Pro.


eye-beholder 

Why Transcribe?

Imagine having Google for your video project. Turning all the dialog into text makes everything easily searchable (and it supports 28 languages). Not to mention making it easy to create captions and subtitles.

The Dragon of Time And Money

Using a traditional transcription service for 40 hours of footage, you’re looking at a minimum of $2400 and a few days to turn it all around. Not exactly cost or time effective. Especially if you’re on a doc budget. However, it’s a problem for all of us.

Transcriptive helps solve the transcription problem, and the problems of searching video and captions/subtitles. It uses A.I. and machine learning to automatically generate transcripts with up to 95% accuracy and bring them into Premiere Pro. And the cost? About $4/hour (or much less depending on the options you choose). So, 40 hours is $160 vs $2400. And you’ll get all of it back in a few hours.
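The cost math, spelled out (the $60/hour figure is implied by the $2400 minimum quoted above for 40 hours):

```python
hours = 40
traditional_rate = 60   # $/hour, implied by the $2400 minimum for 40 hours
ai_rate = 4             # ~$/hour with A.I. transcription

print(traditional_rate * hours)   # 2400
print(ai_rate * hours)            # 160
```

That’s a 15x difference before you even factor in turnaround time.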

Yeah, it’s hard to believe.

Read what these three filmmakers have to say and try the Transcriptive demo out on your own footage. It’ll make it much easier to believe.

 

“We are using Transcriptive to transcribe all of our interviews for EYE OF THE BEHOLDER. The idea of paying a premium for that much manual transcription was daunting. I am in the editing phase now and we are collaborating with a co-producer in New York. We need to share our ideas for edits and content with him, so he is reviewing transcripts generated by Transcriptive and sending us his feedback and vice versa. The ability to get a mostly accurate transcription is fine for us, as we did not expect the engine to know proper names of characters and places in Dungeons & Dragons.” – Kelley Slagle, Cavegirl Productions

Google Your Video Clips and Premiere Project?

 

Since everything lives right within Premiere, all the dialog is fully searchable. It’s basically a word processor designed for transcripts, where every word has time code. Yep, every word of dialog has time code. Click on the word and jump to that point on the timeline. This means you don’t have to scrub through footage to find something. Search and jump right to it. It’s an amazing way for an editor to find any quote or quip.

As Kelley says, “We are able to find what we need by searching the text or searching the metadata thanks to the feature of saving the markers in our timelines. As an editor, I am now able to find an exact quote that one of my co-producers refers to, or find something by subject matter, and this speeds up the editing process greatly.”

Joy E. Reed of Oh My! Productions, who’s directing the documentary ‘Ren and Luca’, adds, “We use sequence markers to mark up our interviews, so when we’re searching for specific words/phrases, we can find them and access them nearly instantly. Our workflow is much smoother once we’ve incorporated the Transcriptive markers into our project. We now keep the Markers window open and can hop to our desired areas without having to flip back and forth between our transcript in a text document and Premiere.”

Workflow, Captions, and Subtitles

ren-luca-L

Captions and subtitles are one of the key uses of Transcriptive. You can use it with Premiere’s captioning tool or export many different file formats (SRT, SMPTE, SCC, MCC, VTT, etc.) for use in any captioning application.

“We’re using Transcriptive to transcribe both sit-down and on-the-fly interviews with our subjects. We also use it to get transcripts of finished projects to create closed captions/subtitles,” says Joy. “We can’t even begin to say how useful it has been on Ren and Luca and how much time it saves us. The turnaround time to receive the transcripts is SO much faster than when we sent it out to a service. We’ve had the best luck with Speechmatics. The transcripts are only as accurate as our speakers – we have a teenage boy who tends to mumble, and his stuff has needed more tweaking than some of our other subjects, but it has been great for very clearly recorded material. The time it saves vs the time you need to tweak for errors is significant.”

captions

Transcriptive is fully integrated into Premiere Pro, so you never have to leave the application or pass metadata and files around. This makes creating captions much easier, allowing you to easily edit each line while playing back the footage. There are also tools and keyboard shortcuts to make editing much faster than in a normal text editor. You then export everything to Premiere’s caption tool and use that to put on the finishing touches and deliver the captions with your media.

Another company doing documentary work is Windy Films. They are focused on telling stories of social impact and innovation, and like most doc makers are usually on tight budgets and deadlines. Transcriptive has been critical in helping them tell real stories with real people (with lots of real dialog that needs transcribing).

They recently completed a project for Planned Parenthood. The deadline was incredibly tight. Harvey Burrell, filmmaker at Windy, says, “We were trying to beat the senate vote on the healthcare repeal bill. We were editing while driving back from Iowa to Boston. The fact that we could get transcripts back in a matter of hours instead of a matter of days allowed us to get it done on time. We use Transcriptive for everything. The integration into Premiere has been incredible. We’ve been getting transcripts done for a long time. The workflow was always a bit clunky; particularly having transcripts in a Word document off to one side. Having the ability to click on a word and just have Transcriptive take you there in the timeline is one of our favorite features.”

Getting Accurate Transcripts using A.I.

 

Audio quality matters. So the better the recording and the more clearly the talent enunciates, the better the transcript. You can get excellent results, around 95% accuracy, with very well recorded audio. That means your talent is well mic’d, there’s not a lot of background noise and they speak clearly. Even if you don’t have that, you’ll still usually get very good results as long as the talent is mic’d. Even accents are ok as long as they speak clearly. Talent that’s off mic, or crosstalk, will make the transcript less accurate.

6-Full-Screen

Transcriptive lets you sign up with the speech services directly, allowing you to get the best pricing. Most transcription products hide the service they’re using (they’re all using one of the big A.I. services), marking up the cost per minute to as much as $0.50/min. When you sign up directly, you get Speechmatics for $0.07/min. And Watson gives you the first 1000 minutes free. (Speechmatics is much more accurate, but Watson can be useful)

Transcriptive itself costs $299 when you check out of the Digital Anarchy store. A web version is coming soon as well. To try transcribing with Transcriptive you can download the trial version here. (remember, Speechmatics is the more accurate service and the only service available in the demo) Reach out to sales@nulldigitalanarchy.com if you have questions or want an extended trial.

Transcriptive is a plugin that many didn’t know they were waiting for. It is changing the workflow of many editors in the industry. See for yourself how we’re transforming the art of transcription.

What Exactly is Adobe TypeKit?

So let’s talk about something that’s near and dear to my heart: Fonts.

I recently discovered Adobe TypeKit. I know…some of you are like… ‘You just discovered that?’.

Yeah, yeah… well, in case there are other folks that are clueless about this bit of the Creative Cloud that’s included with your subscription: It’s a massive font library that can be installed on your Creative Cloud machine… much of which is free (well, included in the cost of CC).

Up until a week ago I just figured it was a way for Adobe to sell fonts. I was mistaken. You find the font you like and, more often than not, you click the SYNC button and, boom… font is installed on your machine for use in Photoshop or After Effects or whatever.

Super cool feature of Creative Cloud that if you’re as clued in as I am about everything CC includes… you might not know about. Now you do. :-) Here’s a bit more info from Adobe.

I realize this probably comes off as a bit of an ad for TypeKit, but it really is pretty cool. I just designed a logo using a new font I found there. And since it’s Adobe, the fonts are of really high quality, not like what you find on free font sites (which is what I’ve relied on for many uses).

F’ing GPUs

One of the fun challenges of developing graphics software is dealing with the many, varied video cards and GPUs out there. (actually, it’s a total pain in the ass. Hey, just being honest :-)

There are a lot of different video cards out there and they all have their quirks. Which are complicated by the different operating systems and host applications… for example, Apple decides they’re going to more or less drop OpenCL in favor of Metal, which means we have to re-write quite a bit of code, Adobe After Effects and Adobe Premiere Pro handle GPUs differently even though it’s the same API, etc. etc. From the end user side of things you might not realize how much development goes into GPU Acceleration. It’s a lot.

The latest release of Beauty Box Video for Skin Retouching (v4.1) contains a bunch of fixes for video cards that use OpenCL (AMD, Intel). So if you’re using those cards it’s a worthwhile download. If you’re using Resolve and Nvidia cards, you also want to download it as there’s a bug with CUDA and Resolve and you’ll want to use Beauty Box in OpenCL mode until we fix the CUDA bug. (Probably a few weeks away) Fun times in GPU-land.

4.1 is a free update for users of the 4.0 plugin. Download the demo and it should automatically remove the older version and recognize your serial number.

Just wanted to give you all some insight on how we spend our days around here and what your hard earned cash goes into when you buy a plugin. You know, just in case you’re under the impression all software developers do is ‘work’ at the beach and drive Ferraris around. We do have fun, but usually it involves nailing the video card of the month to the wall and shooting paintballs at it. ;-)

Creating the Grinch on Video Footage with The Free Ugly Box Plugin

We here at Digital Anarchy want to make sure you have a wonderful Christmas and there’s no better way to do that than to take videos of family and colleagues and turn them into the Grinch. They’ll love it! Clients, too… although they may not appreciate it as much even if they are the most deserving. So just play it at the office Christmas party as therapy for the staff that has to deal with them.

Our free plugin Ugly Box will make it easy to do! Apply it to the footage, click Make Ugly, and then make them green! This short tutorial shows you how:

You can download the free Ugly Box plugin for After Effects, Premiere Pro, Final Cut Pro, and Avid here:

https://digitalanarchy.com/register/register_ugly.php

Of course, if you want to make people look BETTER, there’s always Beauty Box to help you apply a bit of digital makeup. It makes retouching video easy, get more info on it here:

https://digitalanarchy.com/beautyVID/main.html

De-flickering Bix Pix’s Stop Motion Animation Show ‘Tumble Leaf’ with Flicker Free


One of the challenges with stop motion animation is flicker. Lighting varies slightly for any number of reasons causing the exposure of every frame to be slightly different. We were pretty excited when Bix Pix Entertainment bought a bunch of Flicker Free licenses (our deflicker plugin) for Adobe After Effects. They do an amazing kids show for Amazon called Tumble Leaf that’s all stop motion animation. It’s won multiple awards, including an Emmy for best animated preschool show.

Many of us, if not most of us, that do VFX software are wannabe (or just flat out failed ;-) animators. We’re just better at the tech than the art. (exception to the rule: Bob Powell, one of our programmers, who was a TD at Laika and worked on The Boxtrolls among other things)

So we love stop motion animation. And Bix Pix does an absolutely stellar job with Tumble Leaf. The animation, the detailed set design, the characters… are all off the charts. I’ll let them tell it in their own words (below). But check out the 30 second deflicker example below (view at full screen as the Vimeo compression makes the flicker hard to see). I’ve also embedded their ‘Behind The Scenes’ video at the end of the article. If you like stop motion, you’ll really love the ‘Behind the Scenes’.

From the Bix Pix folks themselves… breaking down how they use Flicker Free in their Adobe After Effects workflow:

——————————————————————-

Using Digital Anarchy’s Flicker Free at Bix Pix

Bix Pix Entertainment is an animation studio that specializes in the art of stop-motion animation, and is known for their award-winning show Tumble Leaf on Amazon Prime.

It is not uncommon for an animator to labor for days, sometimes weeks, on a single stop motion shot, working frame by frame. With this process, it is natural to have some light variation between exposures, commonly referred to as ‘flicker’. There are many factors that can cause the shift in lighting. A studio light may blow out or flare. Voltage and/or power surges can brighten or dim lights over a long shot. Certain types of lights, poor lighting equipment, camera malfunctions, or incorrect camera settings can all contribute. Sometimes an animator might wear a white t-shirt, unintentionally adding fill to the shot, or accidentally stand in front of a light, casting a shadow from his or her body.

The variables are endless. Luckily, these days compositors and VFX artists have fantastic tools to help remove these unwanted light shifts. Removing flicker is a very important and necessary first step when working with stop-motion footage, unless it’s an artistic decision to leave that tell-tale flicker in, which is rare.

Here at Bix Pix we use Adobe After Effects for all of our compositing and clean-up work. Having used 4 different flicker removal plugins over the years, we have to say Digital Anarchy’s Flicker Free is the fastest, easiest, and most effective flicker removal software we have come across. It’s also quite affordable.

During a season of Tumble Leaf we will process between 1600 and 2000 shots, ranging from 3 seconds up to a couple of minutes in length. That’s an average of about 5 hours of footage per season, almost three times the length of a feature film, on a tight schedule of less than a year with a small team of ten or so VFX artists and compositors. Nearly every shot has an instance of Flicker Free applied to it as an effect. The plugin is fast, simple to use, and reliable. De-flickering can be done in almost real time.

Digital Anarchy’s Flicker Free has saved us thousands of hours of work and reduced overtime and crunch time delays. This not only saves money but frees up artists to do more elaborate effects that we could not do before due to time constraints, allowing them to focus on making their work stand out even more.

If you are shooting stop-motion animation and require flicker free footage, this is the plugin to use.

———————————————–

For a breakdown of how they do Tumble Leaf, you should definitely check out the Behind the Scenes video!

I even got to meet the lead character, Fig! My niece and nephew (4 and 6) were very impressed. :-)

Hanging out with Fig at BixPix Entertainment

Cheers,
Jim Tierney
Chief Executive Anarchist
Digital Anarchy

Sharpening Video Footage


Sharpening video can be a bit trickier than sharpening photos. The process is the same, of course… increasing the contrast around edges, which creates the perception of sharpness.
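For anyone curious what that looks like mechanically, here’s a minimal, pure-Python sketch of a classic unsharp mask on a single scanline. It’s purely illustrative: real sharpeners work in 2-D with Gaussian blurs, and the numbers here are made up.

```python
def box_blur(row, radius=1):
    """Simple 1-D box blur with clamped edges."""
    n = len(row)
    out = []
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        out.append(sum(row[lo:hi]) / (hi - lo))
    return out

def unsharp_mask(row, radius=1, amount=1.0):
    """Classic unsharp mask: add back the difference between the
    original and a blurred copy. Flat areas are unchanged; pixels
    near an edge get pushed apart, raising local contrast."""
    blurred = box_blur(row, radius)
    return [max(0.0, min(255.0, p + amount * (p - b)))
            for p, b in zip(row, blurred)]

# A hard edge (50 -> 200) gains overshoot on both sides,
# while the flat regions are left untouched.
scanline = [50.0] * 4 + [200.0] * 4
sharp = unsharp_mask(scanline, radius=1, amount=1.0)
```

That overshoot on either side of the edge is exactly what reads as “sharpness”; push it too far and it reads as a halo instead.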

However, because you’re dealing with 30fps instead of a single image, some additional challenges are introduced:

1- Noise is more of a problem.
2- Video is frequently compressed more heavily than photos, so compression artifacts can be a serious problem.
3- Oversharpening is a problem with stills or video, but with video it can also create visually distracting motion artifacts on playback.
4- It’s more difficult to mask out areas like skin that you don’t want sharpened.

These are problems you’ll run into regardless of the sharpening method. However, probably unsurprisingly, in addition to discussing solutions using regular tools, we do talk about how our Samurai Sharpen plugin can help with them.

Noise in Video Footage

Noise is always a problem regardless of whether you’re shooting stills or videos. However, with video the noise changes from frame to frame making it a distraction to the viewer if there’s too much or it’s too pronounced.

Noise tends to be much more obvious in dark areas, as you can see below where it’s most apparent in the dark, hollow part of the guitar:

You can use Samurai Sharpen to avoid sharpening noise in video footage

Using a mask to protect the darker areas makes it possible to increase the sharpening for the rest of the video frame. Samurai Sharpen has masks built-in, so it’s easy in that plugin, but you can do this manually in any video editor or compositing program by using keying tools, building a mask and compositing effects.
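As a rough sketch of that idea (the names and numbers below are hypothetical, not Samurai Sharpen’s actual internals), you can scale the sharpening per pixel by a luminance mask that fades to zero in the shadows:

```python
def sharpen_with_shadow_mask(row, amount=1.5, threshold=60.0, softness=40.0):
    """Unsharp-style sharpening scaled per pixel by a luminance mask:
    pixels darker than `threshold` get no sharpening, ramping up to
    full strength over `softness` levels, so shadow noise is spared."""
    n = len(row)
    out = []
    for i in range(n):
        # 3-tap blur with clamped edges stands in for a real blur
        left, right = row[max(0, i - 1)], row[min(n - 1, i + 1)]
        detail = row[i] - (left + row[i] + right) / 3.0
        mask = max(0.0, min(1.0, (row[i] - threshold) / softness))
        out.append(row[i] + mask * amount * detail)
    return out

# A noise speckle in a dark area is left alone...
dark = [20.0, 20.0, 35.0, 20.0, 20.0]
# ...while the same speckle in a bright area gets amplified.
bright = [200.0, 200.0, 215.0, 200.0, 200.0]
```

This is essentially what building a luma key and compositing a sharpened layer through it does in any editor, just expressed per pixel.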

Compression Artifacts

Many consumer video cameras, including GoPros and some drone cameras, heavily compress footage. Especially when shooting 4K.

It can be difficult to sharpen video that's been heavily compressed

It’s difficult, and sometimes impossible, to sharpen footage like this. The compression artifacts become very pronounced, since they have edges just like normal features. Unlike noise, the artifacts are visible in most areas of the footage, although they tend to be more obvious in areas with lots of detail.

In Samurai you can increase the Edge Mask Strength to lessen the impact of sharpening on the artifacts (they’re often low contrast), but depending on how compressed the footage is, you may not want to sharpen it at all.

Oversharpening

Sharpening is a local contrast adjustment. It’s just looking at significant edges and sharpening those areas. Oversharpening occurs when there’s too much contrast around the edges, resulting in visible halos.

Too much sharpening of video can result in visible halos

Especially if you look at the guitar strings and frets, you’ll see a dark halo on the outside of the strings, and the strings themselves are almost white with little detail. Way too much contrast/sharpening. The usual solution is to reduce the sharpening amount.

In Samurai Sharpen you can also adjust the strength of the halos independently. So if the sharpening results in only the dark or light side being oversharpened, you can dial back just that side.
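One way that kind of control can work, sketched very loosely (this illustrates the concept, not Samurai Sharpen’s actual math), is to apply separate gains to the positive (light halo) and negative (dark halo) parts of the detail signal:

```python
def sharpen_split_halos(row, dark_amount=1.0, light_amount=1.0):
    """Sharpening where the negative (dark halo) and positive
    (light halo) parts of the detail get independent gains, so an
    oversharpened dark ring can be dialed back on its own."""
    n = len(row)
    out = []
    for i in range(n):
        # 3-tap blur with clamped edges stands in for a real blur
        left, right = row[max(0, i - 1)], row[min(n - 1, i + 1)]
        detail = row[i] - (left + row[i] + right) / 3.0
        gain = light_amount if detail > 0 else dark_amount
        out.append(max(0.0, min(255.0, row[i] + gain * detail)))
    return out

edge = [50.0, 50.0, 50.0, 200.0, 200.0, 200.0]
# Keep the light halo at full strength, kill the dark halo entirely:
trimmed = sharpen_split_halos(edge, dark_amount=0.0, light_amount=1.0)
```

With both gains at 1.0 you get an ordinary unsharp mask; zeroing one side removes just that half of the halo.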

Sharpening Skin

The last thing you usually want to do is sharpen someone’s skin. You don’t want your talent’s skin looking like a dried-up lizard. (well, unless your talent is a lizard. Not uncommon these days with all the ridiculous 3D company mascots)

Sharpening video can result in skin looking rough

Especially with 4K and HD, video already shows more skin detail than most people want (hence our Beauty Box Video plugin for digital makeup). If you’re using Unsharp Mask you can use the Threshold parameter; in Samurai, the Edge Mask Strength parameter is a more powerful version of that. Both are good ways of protecting skin from sharpening. Skin tends to be fairly flat contrast-wise, and the Edge Mask generally does a good job of masking the skin areas out.
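The threshold idea can be sketched like this (a toy version of the concept behind Unsharp Mask’s Threshold; a real edge mask is more sophisticated):

```python
def sharpen_with_edge_threshold(row, amount=1.5, threshold=10.0):
    """Unsharp-style sharpening that ignores low-contrast detail:
    if the local detail is below `threshold` levels, the area is
    treated as flat (e.g. skin) and left completely alone."""
    n = len(row)
    out = []
    for i in range(n):
        # 3-tap blur with clamped edges stands in for a real blur
        left, right = row[max(0, i - 1)], row[min(n - 1, i + 1)]
        detail = row[i] - (left + row[i] + right) / 3.0
        if abs(detail) < threshold:
            detail = 0.0  # below threshold: no sharpening here
        out.append(max(0.0, min(255.0, row[i] + amount * detail)))
    return out

skin = [120.0, 122.0, 121.0, 123.0]   # low-contrast "skin" texture
hard_edge = [50.0, 50.0, 200.0, 200.0]
```

The subtle pore-level variation stays untouched while strong edges still get sharpened.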

Either way, you want to keep an eye on the skin areas, unless you want a lizard. (and if so, you should download our free Ugly Box plugin ;-)

Wrap Up

You can sharpen video, and most video footage will benefit from some sharpening. However, there are numerous issues you can run into, and hopefully this gives you some idea of what you’re up against, whether you’re using Samurai Sharpen for Video or something else.

My Hopes for Open-Hearted, Strong America

I usually don’t mix politics and business. However, I feel this is an extraordinary election. I encourage you to get out and vote.

I am hopeful that tomorrow we will have our first woman president. I am hopeful that America can rise above the hate, fear and pettiness that has defined Donald Trump’s campaign. I am hopeful that we can live up to the words on the Statue of Liberty… “Give me your tired, your poor, your huddled masses yearning to breathe free, the wretched refuse of your teeming shore. Send these, the homeless, tempest-tossed to me, I lift my lamp beside the golden door!”

We are a nation of immigrants. That is one of the things that makes America great. People of all cultures want to come here not to change our culture but to live it! Perhaps add a bit of their culture as a flourish, but they come here because they believe, as I did when I used to say the pledge of allegiance in school, that America represents equality, freedom (including freedom of religion), and opportunity for everyone. Perhaps that’s not as true as it could be but I’ve always felt we at least aspire to that.

I am hopeful that America still wants to aspire to that… and not the racism, xenophobia, and small mindedness that Trump represents.

I am pro-business, but I am also pro-people. Trump is neither. Good businessmen don’t bankrupt companies on a regular basis, screwing employees, investors and partners. Even in Silicon Valley where failure is sometimes a badge of honor, Trump’s record is dismal. This is why Mark Cuban offered Trump $10 million to give details on his policy proposals.

Any entrepreneur that’s run a business knows you aren’t going to succeed without a plan. Trump has no plan.

I want to see America continue to succeed and continue its greatness. I think we can do better for those that have not benefitted from an increasingly global world. I think we can integrate immigrants, as we ALWAYS have, giving them opportunities while benefitting from the skills and perspective they bring. I think we can educate all Americans, poor as well as rich, black/brown as well as white, so they can take advantage of the opportunities the world has to offer.

Hillary may not be perfect (none of us are) but she has a plan and knows how the government and the world works. I have far more faith in her to achieve what needs to be done than I do in Trump who will likely bankrupt the country like he has his companies.

I care about America and I care about her people. I think this country is already great. I think we can aspire to be even better. But it requires compassion and acceptance as well as dedication and hard work. It is time for a woman to lead this country, someone who can bring all those qualities to the table.

I sincerely hope that we can be the open-hearted, strong country that we’ve usually been and not succumb to fear and close-mindedness. I believe we can.

Do not use Norton Anti-Virus


We highly recommend against using Norton Anti-Virus. In an attempt to be smart, they proactively quarantine programs because “fewer than 50 users in the Norton community have them”. This means many of our plugins get quarantined when you try to install them.

Our installers pose no threat and you can safely install them.

Here’s what Norton puts up:

Do not use Norton Anti-virus as it's unreliable

1- It describes the risk as Heur.AdvML.C and labels it a ‘heuristic virus’ which sounds scary and looks like a virus name. It’s not. It’s a Norton code for their ‘artificial intelligence’. If this is how smart AI is, it’s going to be a long time before the bots take over the world.

2- Our major crimes against humanity seem to be that fewer than 50 users have installed this and it was uploaded over 4 months ago.

That’s it. So Norton’s ‘malware heuristics’ AI has decided we’re a High threat.

This is misleading and doing a disservice to us and our users. I assume most other plugins from small companies will fall under the same umbrella of stupidity.

As such, we recommend you use different anti-virus software.

Thoughts on The Mac Pro and FCP X


There’s been some talk of the imminent demise of the Mac Pro. The Trash Can is getting quite long in the tooth… it was overpriced and underpowered to begin with and is now pretty out of date. Frankly, it’d be nice if Apple just killed it and moved on. It’s not where they make their money and it’s clear they’re not that interested in making machines for the high end video production market. At the very least, it would mean we (Digital Anarchy) wouldn’t have to buy Trash Can 2.0 just for testing plugins. I’m all for not buying expensive machines we don’t have any use for.

But if they kill off the Mac Pro, what does that mean for FCP X? Probably nothing. It’s equally clear the FCP team still cares about pro video. There were multiple folks from the FCP team at NAB this year, talking to people and showing off FCP at one of the sub-conferences. They also continue to add pro-level features.

That said, they may care as much (maybe even more) about the social media creators… folks doing YouTube, Facebook, and other types of social media creation. There are a lot of them. A lot more than folks doing higher end video stuff, and these creators are frequently using iPhones to capture and the Mac to edit. They aren’t ‘pro editors’ and I think that demographic makes up a good chunk of FCP users. It’s certainly the folks that Apple, as a whole, is going after in a broader sense.

If you don’t think these folks are a significant focus for Apple overall, just look at how much emphasis they’ve put on the camera in the iPhone 6 & 7… 240fps video, dual lenses, RAW shooting, etc. To say nothing of all the billboards with nothing but a photo ‘taken with the iPhone’. Everyone is a media creator now and ‘Everyone’ is more important to Apple than ‘Pro Editors’.

The iMacs are more than powerful enough for those folks and it wouldn’t surprise me if Apple just focused on them. Perhaps coming out with a couple of very powerful iMacs/MacBook Pros as a nod to professionals, but letting the Mac Pro fade away.

Obviously, as with all things Apple, this is just speculation. However, given the lack of attention professionals have gotten over the last half decade, maybe it’s time for Apple to just admit they have other fish to fry.

Tutorial: Removing Flicker from Edited Video Footage


One problem that users can run into with our Flicker Free deflicker plugin is that it will look across edits when analyzing frames for the correct luminance. The plugin looks backwards as well as forwards to gather frames and does a sophisticated blend of all those frames. So even if you create an edit, say to remove an unwanted camera shift or person walking in front of the camera, Flicker Free will still see those frames.

This is particularly a problem with Detect Motion turned OFF.

The way around this is to Nest (i.e. Pre-compose (AE), Compound Clip (FCP)) the edit and apply the plugin to the new sequence. The new sequence will start at the first frame of the edit and Flicker Free won’t be able to see the frames before the edit.

This is NOT something you always have to do. It’s only if the frames before the edit are significantly different than the ones after it (i.e. a completely different scene or some crazy camera movement). 99% of the time it’s not a problem.

This tutorial shows how to solve the problem in Premiere Pro. The technique works the same in other applications; just replace ‘Nesting’ with whatever your host application calls it (pre-composing, making a compound clip, etc).

Is The iPhone A Real Camera?

For whatever reason I’ve seen several articles/posts over the last few days about whether you can be a photo/videographer with a camera phone. Usually the argument is that just because the iPhone (or whatever) can take the occasional good video/pictures, it doesn’t make you a good videographer. Of course not. Neither does a 5Dm4 or an Arri Alexa.

Camera phones can be used for professional video.

But what if you have a good eye and are a decent videographer? I think a lot of the hand wringing comes from people that have spent a lot of money on gear and are seeing people get great shots with their phone. It’s not going to change. The cameras in a lot of phones are really good and if you have a bit of skill, it can go a long way. You can check out this blog post comparing the iPhone’s slow motion video capabilities to a Sony FS700. The 10x price difference doesn’t beget a 10x quality difference.

There is obviously a place for long or fast lenses that you need a real camera for. There are definitely shots you won’t get with a phone. However, there are definitely shots you can get with a phone that you can’t get with your big, fancy camera. Partially just because you ALWAYS have your phone and partially because of the size. Sometimes the ability to spontaneously shoot is a huge advantage.

Then you add something like Dave Basaluto’s iOgrapher device and you’ve got a video camera capable of some great stuff, especially for stock or B roll.

There are issues for sure, especially with these devices trying to shoot 4K, like a GoPro. It doesn’t matter how well lit and framed the shot is if it’s covered in massive compression artifacts.

Overall though, the cameras are impressive and if you’ve got the skills, you can consistently get good to great shots.

What’s this got to do with Digital Anarchy? Absolutely nothing. We just like cool cameras no matter what form they take. :-)

(and, yes, I’m looking forward to getting the new 5D mark4. It was finally time to upgrade the Digital Anarchy DSLR)

VR: Because Porn! (and Siggraph and other stuff)

Over the last few months I’ve been to NAB, E3, and Siggraph and seen a bunch of VR stuff.

Most VR people with their headsets

One panel discussion about VR filmmaking was notable for the amount of time spent talking about all the problems VR has and how, once they solve this or that major, non-trivial problem, VR will be awesome! One of these problems is that, as one of the panelists pointed out, anything over 6-8 minutes doesn’t seem to work. I’m supposed to run out and buy VR headsets for a bunch of shorts? Seriously?

E3 is mostly about big game companies and AAA game titles. However, if you go to a dark, back corner of the show floor you’ll find a few rows of small 10×20 booths. It was here that I finally found a VR experience that lived up to expectations! Porn. Yes, there was a booth at E3 showing hardcore VR porn. (I wonder if they told E3 what they were showing?)

One of my favorite statistics ever is that adult, pay-per-view movies in hotel rooms are watched, on average, for about 12 minutes. Finally! A use case for VR that matches up perfectly to its many limitations. You don’t need to worry about the narrative and no one is going to watch it for more than 12 minutes. Perfect. I’m sure the hot, Black Friday special at Walmart will be the Fleshlight/Oculus Rift bundle.

Surely There Are Other Uses Besides Porn?

Ok, sure, there are. I just haven’t found them to be compelling enough to justify all the excitement VR is getting. One booth at Siggraph was showing training on how to fix damaged power lines. This included a pole with sensors on the end of it that gave haptic (vibration) feedback to the trainee and controlled the virtual pole in the VR environment. There are niche uses like this that are probably viable.

There are, of course, games, which are VR’s best hope for getting into the mainstream. These are MUCH more compelling in the wide open space of a tradeshow than I think they’re going to be in someone’s living room. For the rank and file gamer that doesn’t want to spend $8K on a body suit to run around their living room in… sitting on the couch with a headset is probably going to be less than an awesome experience after the novelty wears off. (and we don’t want to see the average gamer in a body suit. Really. We don’t.)

And then there are VR films. There was a pretty good 5 minute film called Giant being shown at Siggraph. Basically the story of parents and an 8 year old daughter in a basement in a war zone. You sat on a stool that could vibrate, strapped on the headset, and you were sitting in a corner of this basement. It was pretty intense.

However, the vibrating stool that let you feel the bombs being dropped probably added more to the experience than the VR did. I think it probably would have been more intense as a regular film. The problem with VR is that you can’t do close-ups and multiple cameras, so a regular film would have been able to capture the emotions of the actors better. And since it’s VR, my tendency was to look around the basement rather than focus on what was happening in the scene. There was very little of interest in the basement besides the actors, so it was just a big distraction.

So if your idea of a good time is watching game cinematics, which is what it felt like, then VR films are for you. And that was a good VR experience. Most VR film stuff I’ve seen either 1) is incredibly bland without a focal point or 2) uses the simulation of an intimate space to shock you. (Giant was guilty of this to some degree) The novelty of this is going to wear off as fast as a 3D axe thrown at the screen.

There are good uses for VR. It just doesn’t justify the hype and excitement people are projecting onto it. For all the money that’s pouring into it, it’s disappointing that the demos most companies are still showing (and expecting you to be excited about) are just 360 environments. “But look! There are balloons falling from the sky! Isn’t it cool?!” Uh… yeah. Got any porn?

Comparing Beauty Box to Other Video Plugins for Skin Retouching/Digital Makeup

We get a lot of questions about how Beauty Box compares to other filters out there for digital makeup. There are a few things to consider when buying any plugin, and I’ll go over them here. I’m not going to compare Beauty Box with any filter specifically, but when you download the demo plugin and compare it with the results from other filters, this is what you should be looking at:

  • Quality of results
  • Ease of use
  • Speed
  • Support

Support

I’ll start with Support because it’s the one thing most people don’t consider. We offer support as good as anyone’s in the industry. You can email or call us (415-287-6069), M-F 10am-5pm PST. In addition, we also check email on the weekends and frequently in the evenings on weekdays. Usually you’ll get a response from Tor, our rockstar QA guy, but not infrequently you’ll talk to me as well. It’s not often you get tech support from the guy that designed the software. :-)

Quality of Results

The reason you see Beauty Box used for skin retouching on everything from major tentpole feature films to web commercials is the incredible quality of the digital makeup. Since its release in 2009 as the first plugin to specifically address skin retouching beyond just blurring out skin tones, the quality of the results has been critically acclaimed. We won several awards with version 1.0 and we’ve kept improving it since then. You can see many examples of Beauty Box’s digital makeup here, but we recommend you download the demo plugin and try it yourself.

Things to look for as you compare the results of different plugins:

Skin Texture: Does the skin look realistic? Is some of the pore structure maintained or is everything just blurry? It should, usually, look like regular makeup unless you’re going for a stylized effect.
Skin Color: Is there any change in skin tones?
Temporal Consistency: Does it look the same from frame to frame over time? Are there any noticeable seams where the retouching stops?
Masking: How accurate is the mask of the skin tones? Are there any noticeable seams between skin and non-skin areas? How easy is it to adjust the mask?

Ease of Use

One of the things we strive for with all our plugins is to make it as easy as possible to get great results with very little work on your end. Software should make your life easier.

In most cases, you should be able to click on Analyze Frame, make an adjustment to the Skin Smoothing amount to dial in the look you want and be good to go. There are always going to be times when it requires a bit more work but for basic retouching of video, there’s no easier solution than Beauty Box.

When comparing filters, the thing to look for here is how easy it is to set up the effect and get a good mask of the skin tones. How long does it take and how accurate is it?

Speed

If you’ve used Beauty Box for a while, you know that the only complaint we had with version 1.0 was that it was slow. No more! It’s now fully GPU optimized, and with some of the latest graphics cards you’ll get real time performance, particularly in Premiere Pro. Premiere has added better GPU support, and between that and Beauty Box’s use of the GPU, you can get real time playback of HD pretty easily.

And of course we support many different host apps, which gives you a lot of flexibility in where you can use it. Avid, After Effects, Premiere Pro, Final Cut Pro, Davinci Resolve, Assimilate Scratch, Sony Vegas, and NUKE are all supported.

Hopefully that gives you some things to think about as you’re comparing Beauty Box with other plugins that claim to be as good. All of these things factor into why Beauty Box is so highly regarded and considered to be well worth the price.

Back Care for Video Editors Part 3: Posture Exercises: The Good and The Bad


Posture Exercises: The Good and The Bad

There are a lot of books out there on how to deal with back pain. Most of them are relatively similar and have good things to say. Most of them also have minor problems, but overall, with a little guidance from a good physical therapist, they’re very useful.

You don’t need to sit on ice to get good posture!

The two I’ve been using are:

Back RX by Vijay Vad

8 Steps to a Pain Free Back (Gokhale Method)

Both have some deficiencies but overall are good and complement each other. I’ll talk about the good stuff first and get into my problems with them later (mostly minor issues).

There’s also another book, Healing Back Pain, which I’m looking into; it says some valuable things. It posits that the main cause of the pain is not actually structural (disc problems, arthritis, etc) but in most cases caused by stress and the muscles tensing. I’ll do a separate post on it, as I think the mind plays a significant role and this book has some merit.

BackRX

Back RX is a series of exercise routines designed to strengthen your back. It pulls from Yoga, Pilates, and regular physical therapy for inspiration. If you do them on a regular basis, you’ll start improving the strength in your abs and back muscles which should help relieve pain over the long term.

backRX

As someone that’s done Yoga for quite some time, partially in response to the repetitive stress problems I had from using computers, I found the routines very natural. Even if you haven’t done Yoga, the poses are mostly easy, many of them have you lying on the floor, and are healthy for your back. You won’t find the deep twisting and bending poses you might be encouraged to do at a regular yoga studio.

It also encourages mind/body awareness and focuses a bit on breathing exercises. The book doesn’t do a great job of explaining how to do this. If you’re not already a yoga practitioner or have a meditation practice you’ll need some guidance. The exercises have plenty of value even if you don’t get into that part of it. However, mindfulness is important. Here are a few resources on using meditation for chronic pain:

Full Catastrophe Living
Mindfulness Based Stress Reduction
You Are Not Your Pain

Gokhale Method

The 8 Steps to a Pain Free Back (Gokhale Method) is another good book that takes a different approach. BackRX provides exercise routines you can do in about 20 minutes. The Gokhale Method shows modifications to the things we do all the time… lying, sitting, standing, bending, etc. These are modifications you’re supposed to make throughout the day.

She has something of a backstory about how doctors these days don’t know what a spine should look like and how people had different shaped spines in the past. In a nutshell, the argument is that because we’ve become so much more sedentary over the last 100 years (working in offices, couch potato-ing, etc.) our spines are less straight, and doctors now think this excessively curved spine is ‘normal’. I’m very skeptical of this, as some of her claims are easily debunked (more on that later). However, that doesn’t take away from the value of the exercises. Whether you buy into her marketing or not, she’s still promoting good posture, and that’s the important bit.

Some of her exercises you will find similar to other Posture books. Other Gokhale exercises are novel. They may not all resonate with you, but I’ve found several to be quite useful.

Some good posture advice if you're sitting in front of a computer

All of the exercises focus on lengthening the spine and provide ways to hold that posture above and beyond the usual ‘Sit up straight!’. She sells a small cushion that mounts on the back of your chair. I’ve found this useful, if only because it constantly reminds me not to slump in my Steelcase chair (slumping completely offsets why you spent the money on a fancy chair). It prevents me from leaning back in the chair, which is the first step to slumping, and it does help keep your back a bit straighter. Some chairs are not well designed, and the cushion helps with those too.

In both books, there’s an emphasis on stretching your spine and strengthening your ab/core muscles and back muscles. BackRX focuses more on the strengthening, Gokhale focuses more on the stretching.

But ultimately they only work if you’re committed to doing them over the long term. You also have to be vigilant about your posture. If you’re in pain, this isn’t hard, as your back will remind you with pain whenever you’re not doing things correctly. It’s harder if you’re just trying to develop good habits and aren’t in pain already.

Most people don’t think about this at all, which is why 80% of the US population will develop back pain problems at some point. So even if you only read the Gokhale book and just work on bending/sitting/walking better you’ll be ahead of the game.

So what are the problems with the books?

Both the Gokhale Method and BackRX have some issues. (again, these don’t really detract from the exercises in the book… but before you run out and tell your doctor his medical school training is wrong, you might want to consider these points)

Gokhale makes many claims in her book. Most of them involve how indigenous cultures sit/walk/etc. and how little back pain there is in those cultures. These are not easily testable. However, she makes other claims that can be tested. For one, she shows a drawing of a spine from around 1900 and a drawing that she claims is from a recent anatomy book. She puts this forth as evidence that spines used to look different and that modern anatomy books don’t show spines the way they’re supposed to look. This would mean modern doctors are being taught incorrectly and thus don’t know what a spine should look like. The reality is that modern anatomy books show spines that look nothing like her example, which is just a horrible drawing of a spine. In fact, illustrations of ‘abnormal’ spines are closer to what she has in her book.

Also, most of the spine illustrations from old anatomy books are pretty similar to modern illustrations. On average the older illustrations _might_ be slightly straighter than modern illustrations, but mostly they look very similar.

She also shows some pictures of statues to illustrate that everyone in ancient times walked around with a straight back. She apparently didn’t take Art History in college and doesn’t realize these statues from 600 BC are highly stylized, built that way because sculptors lacked the technology to carve more lifelike figures. So, no, everyone in ancient Greece did not ‘walk like an Egyptian’.

BackRX has a different issue. Many of the photos showing proper poses are correct for the back, BUT not for the rest of the body. A common pose called Tree Pose is shown with the foot against the knee, similar to this photo:

How not to do tree pose - don't put your foot on your opposite knee

This risks injury to the knee! The foot should be against the side of the upper thigh.

Likewise, sitting properly at a desk is shown with good back posture, but with the forearms and wrists positioned in a way that’s sure to give the person carpal tunnel syndrome. These are baffling photos for a book about taking care of your body.

Most of the exercises in this book are done lying down and are fine. For sitting and standing poses I recommend googling the exercise to make sure it’s shown correctly. For example, google ‘tree pose’ and compare the pictures to what’s in the book.

Overall they’re both good books despite the problems. The key thing is to listen to your body. Not everything offered will work for you, so you need to experiment a bit. This includes working with your mind, which definitely has an effect on pain and how you deal with it.

Computers and Back Care part 2: Forward Bending


Go to Part 1 in the Back Care series

Most folks know how to pick up a heavy box. Squat down, keep your back reasonably flat and upright and use your legs to lift.

However, most folks do not know how to plug in a power cord. (as the below photo shows)

How to bend forward if you're plugging in a power cord

Forward bending puts a great deal of stress on your back and we do it hundreds of times a day. Picking up your keys, putting your socks on, plugging in a power cord, and on and on. This is why people frequently throw their backs out sneezing or picking up some insignificant thing off the floor like keys or clothing.

While normally these don’t cause much trouble, the hundreds of bends a day add up. Especially if you sit in a chair all day and are beating up your back with a bad chair or bad posture. Over time all of it weakens your back, degrades discs, and causes back pain.

So what to do?

There are a couple books I can recommend. Both have some minor issues but overall they’re very good. I’ll talk about them in detail in Part 3 of this series.

Back RX by Vijay Vad
8 Steps To a Pain Free Back by Esther Gokhale

Obviously for heavy objects, keep doing what you’re probably already doing: use your legs to lift.

But you also want to use your legs to pick up almost any object. The same technique works for small objects as well. That said, all that squatting can be a bit tough on the knees, so let’s talk about hip hinging.

Woman hinging from the hips in a way that puts less pressure on your back

(The image shows a woman stretching, but she’s doing it with a good hip hinge. Since it’s a stretch, it’s, uh, a bit more exaggerated than what you’d do picking something up. Not a perfect image for this post, but we’ll roll with it.)

Imagine your hip as a door hinge: your upright back is the door and your legs are the wall. Keep your back mostly flat and hinge at the hips, tilting your pelvis instead of bending your back. Then bend your legs to get the rest of the way to the floor. This puts less strain on your back and not as much strain on your knees as a full squat. Also, engage your abs as you’re hinging; strong abs help maintain a strong back.

Directions on how to hip hinge, showing a good posture

There’s some disagreement on the best way to do this. Some say bend forward (with your knees slightly bent) until you feel a stretch in your hamstrings, then bend your knees. I usually hinge the back and bend the knees at the same time. This feels better for my body, but everyone is different so try it both ways. There is some truth that the more length you have in your hamstrings, the more you can hinge. However, since most people, especially those that sit a lot, have tight hamstrings, it’s just easier to hinge and bend at the same time.

But the really important bit is to be mindful of when you’re bending, regardless of how you do it. Your back isn’t going to break just from some forward bending, but the more you’re aware of how often you bend and doing it correctly as often as possible, the better off you’ll be.

This also applies to just doing regular work, say fixing a faucet or something where you have to be lower to the ground. If you can squat and keep a flat back instead of bending over to do the work, you’ll also be better off.

If this is totally new to you, then your back may feel a little sore as you use muscles you aren’t used to using. This is normal and should go away. However, it’s always good to check in with your doctor and/or physical therapist when doing anything related to posture.

In Part 3 I’ll discuss the books I mentioned above and some other resources for exercises and programs.

Taking Care of Your Back for Video Editors, Part 1: The Chair


Software developers, like video editors, sit a lot. I’ve written before about my challenges with repetitive stress problems and how I dealt with them (awesome chair, great ergonomics, and a Wacom tablet). Those problems were more about my wrists, shoulders, and neck.

I fully admit to ignoring everyone’s advice about sitting properly and otherwise taking care of my back, so I expect you’ll probably ignore this too (unless you already have back pain). But you shouldn’t. And maybe some of you will listen and pick up some tips to help you avoid a daily diet of pain meds just to get through a video edit.

Video editors need good posture

I’ve also always had problems with my back. The first time I threw it out I was 28, playing basketball. Then add in being physically active in a variety of other ways… martial arts, snowboarding, yoga, etc… my back has taken some beatings over the years. And then you factor in working at a job for the last 20 years that has me sitting a lot.

And not sitting very well for most of those 20 years. Hunched over a keyboard and slouching in your chair at the same time is a great way of beating the hell out of your back and the rest of your body. But that was me.

So, after a lot of pain and an MRI showing a couple degraded discs, I’m finally taking my back seriously. This is the first of several blog posts detailing some of the things I’ve learned and what I’m doing for my back. I figure it might help some of you all.

I’ll start with the most obvious thing: Your chair. Not only your chair BUT SITTING UPRIGHT IN IT. It doesn’t help you to have a $1000 chair if you’re going to slouch in it. (which I’m known to be guilty of)

A fully adjustable chair can help video editors reduce back pain

The key thing about the chair is that it’s adjustable in as many ways as possible. This way you can set it up perfectly for your body, which is key. Personally, I have a Steelcase chair which I like, but most high end chairs are very configurable and come in different sizes. (I’m not sure the ‘ball chair’ is going to be good for video editing, but some people love them for normal office work) There are also adjustable standing desks, which allow you to alternate between sitting and standing, which is great. Being in any single position for too long is stressful on your body.

The other key thing is your posture: actually sitting in the chair correctly. There are slightly different opinions on what precisely the best sitting posture is (see Part 3 for more on this), but generally, the illustration below is a good upright position. Feet on the ground, knees at right angles, butt all the way back with some spine curvature (but not too much), the shoulders slightly back, and the head above the shoulders, not forward as we often hold it, which puts a lot of strain on the neck. If you keep leaning in to see your monitor, get glasses or move the monitor closer!

It can also help to have your abdominal muscles engaged to prevent too much curvature in the spine. This can be a little bit of work, but if you’re paying attention to your posture, it should come naturally as you maintain the upright position.

You want to sit upright in your chair for good back health

There’s a little bit of disagreement on how much curvature you should have while sitting. Some folks recommend even less than what you see above. We’ll talk more about it in Part 3.

One other important thing is to take breaks, either walk around or stretch. Sitting for long periods really puts a lot of stress on your discs and is somewhat unnatural for your body, as your ancestors probably weren’t doing a lot of chair sitting. Getting up to walk, do a midday yoga class, or just doing a little stretching every 45 minutes or so will make a big difference. This is one of the reasons a standing desk is helpful.

So that’s it for part 1. Get yourself a good chair and learn how to sit in it! It’ll greatly help you keep a healthy, happy back.

In Part 2 we’ll discuss picking up your keys, sneezing, and other dangers to back health lurking in plain sight.

We Live in A Tron Universe: NASA, Long Exposure Photography and the Int’l Space Station


I’m a big fan of long exposure photography (and time lapse, and slow motion, etc. etc. :-). I’ve done some star trail photography from the top of Haleakala in Maui. 10,000 feet up on a rock in the middle of the Pacific is a good place for it! So I was pretty blown away by some of the images released by NASA that were shot by astronaut Don Pettit.

Long Exposure photos of star trails from space

I think these have been up for a while, they were shot in 2012, but it’s the first I’ve seen of them. Absolutely beautiful imagery, although they make the universe look like the TRON universe. These were all shot with 30 second exposures and then combined, as Don says:

“My star trail images are made by taking a time exposure of about 10 to 15 minutes. However, with modern digital cameras, 30 seconds is about the longest exposure possible, due to electronic detector noise effectively snowing out the image. To achieve the longer exposures I do what many amateur astronomers do. I take multiple 30-second exposures, then ‘stack’ them using imaging software, thus producing the longer exposure.”
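The ‘stacking’ Don describes is typically a per-pixel lighten (maximum) blend: each pixel keeps the brightest value it ever had across the exposures, so the moving stars leave trails while the dark sky stays dark. Here’s a minimal sketch of the idea in Python with NumPy, using tiny synthetic frames in place of real 30-second exposures (the function name and data are just for illustration):

```python
import numpy as np

def stack_lighten(frames):
    """Combine exposures by keeping the brightest value seen at each
    pixel: star trails accumulate while the dark sky stays dark."""
    stacked = frames[0].astype(np.float32)
    for f in frames[1:]:
        stacked = np.maximum(stacked, f.astype(np.float32))
    return stacked.astype(np.uint8)

# Three tiny synthetic "30-second exposures": a single star moves one
# pixel per frame across an otherwise dark sky.
frames = []
for i in range(3):
    frame = np.full((4, 6), 10, dtype=np.uint8)  # dark sky background
    frame[1, i] = 250                            # star's position this frame
    frames.append(frame)

trail = stack_lighten(frames)
print(trail[1])  # row 1 shows the trail: bright in the first 3 columns
```

With real frames you’d load each image into an array (e.g. with imageio) and run the same max blend; doing noise reduction on each frame before stacking helps too.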

You can see the entire 36 photo set on Flickr.

Having done long exposures myself that were 10 or 15 minutes, the images are noisy but not that bad. I wonder if being in space causes the camera sensors to pick up more noise. If anyone knows, feel free to leave a comment.

If you’re stuck doing star photography from good ol’ planet Earth, then noise reduction software helps. You also want to shoot RAW, as most RAW software will automatically remove dead pixels. These are particularly annoying in astrophotography.

But the space station photos are really amazing, so head over to Flickr and check them out! They’re not totally public domain (they can’t be used commercially), but you can download the high res versions of the photos and print or share them as you see fit. Here’s a few more to whet your appetite:

The shots were created in Photoshop by combining multiple 30 second exposure photos

Amazing TRON like photos taken from the space station

The Problem of Slow Motion Flicker during Big Sporting Events: NCAA Tournament


Shooting slow motion footage, especially very high speed shots like 240fps or 480fps, results in flicker if you don’t have high quality lights. Stadiums often have low quality industrial lighting, LEDs, or both, resulting in flicker during slow motion shots even on nationally broadcast, high profile sporting events.

I was particularly struck by this watching the NCAA Basketball Tournament this weekend. It seemed like I was seeing flicker on half of the slow motion shots. You can see a few in this video (along with versions of the same footage de-flickered with the Flicker Free plugin):

To see how to get rid of the flicker you can check out our tutorial on removing flicker from slow motion sports.

The LED lights are most often the problem. They circle the arena and, depending on how bright they are (for example, when the band is turned solid white), they can cast enough light on the players to cause flicker when played back in slow motion. Even if they don’t cast light on the players, they’re visible flickering in the background. Here’s a photo of the lights I’m talking about in Oracle Arena (the white band of light going around the stadium):

Deflickering stadium lights can be done with Flicker Free

While Flicker Free won’t work for live production, it works great for removing this type of flicker if you can render the footage in a video editing app, as you can see in the original example.

It’s a common problem even for pro sports and high profile sporting events (once you start looking for it, you see it a lot). So if you run into it with your footage, check out the Flicker Free plugin, available for most video editing applications!

Tips on Photographing Sports – Sneaking a Lens In and Other Stories


I love photographing sports. It’s a lot like shooting wildlife/Humpback Whales in many ways. It requires a lot of patience and quick shooting skills.

Unfortunately, I’m usually limited to shooting from the stands. So this makes the process a little harder but if you can get good seats you can make it work. As it happens, I recently got third row seats to the Golden State Warriors game against the Lakers. So here are a few tips for getting great shots if you can’t actually get a press pass.

Depth of field is always important when photographing sports

The first thing you need to check is how long a lens you’re allowed to bring in. In this case it was 3″ or less, so that’s what needs to be attached to the camera. (See the end of the article for some ‘other’ suggestions.)

I ended up using a 100mm f2 lens for these shots, which is exactly 3″. You want as fast a lens as possible. You’re not going to be able to use a flash, so you’re reliant on the stadium lighting, which isn’t particularly bright. f2.8 is really the minimum, and even then you’ll have the ISO higher than you’d like. As with wildlife, the action moves fast, so the wider the aperture, the faster the shutter speed, and the sharper the shots.

The minimum shutter speed is probably about 1/500, and you’d like 1/2000 or higher. Hence the need for an f2 or f2.8 lens. Otherwise the action shots, where you really want things sharp, will be a bit blurry.
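The tradeoff here is plain stop arithmetic: light gathered scales with 1/f-number², and each full stop gained lets you halve the shutter duration. A quick sketch (the helper names are my own, not camera APIs):

```python
import math

def stops_gained(f_slow, f_fast):
    """Full stops of light gained by opening up the aperture.
    Light scales with the aperture area, i.e. with 1 / f_number**2."""
    return math.log2((f_slow / f_fast) ** 2)

def faster_shutter(shutter_sec, stops):
    """Each stop of extra light lets you halve the shutter duration."""
    return shutter_sec / (2 ** stops)

# f/2.8 -> f/2 is about one stop...
stops = stops_gained(2.8, 2.0)
# ...so a 1/500s exposure at f/2.8 becomes roughly 1/1000s at f/2,
# without touching the ISO.
print(stops, faster_shutter(1 / 500, stops))
```

That one stop is the difference between slightly blurry and tack sharp when a player is driving to the basket.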

Seat placement matters. Obviously you want to be as close as possible, but you also want to be at the ends of the court/field. That’s where most of the action happens. Center court seats may be great for watching the game, but behind the goal seats get you up close and personal for half of the action. Much better for photography and hence one of the reasons the press photogs are on the baseline.

Photographing basketball is best from the baseline

What if you’re not happy with a 3″ lens? Well, you COULD give a friend a larger lens and let them try to smuggle it in. Since it’s not attached to the camera, most security people won’t recognize it as a camera lens. Just say it’s, you know, a binocular or something (monocular? ;-). Usually it works; the worst that happens is you have to go back to the car and store it. You’re not trying to break the rules, you’re, uh, helping train the security staff.

If you do manage to get a larger lens in, don’t expect to be able to use it much. One of the ushers will eventually spot it (especially if it’s a big, white, L Canon lens) and call you on it. You’ll have to swap it for the other lens (or risk getting kicked out). Wait until the game is well underway before trying to use it.

Of course, the basic tips apply… Shoot RAW, make sure you have large, empty memory card(s) and a fully charged battery, don’t spill beer on the camera, etc., etc. But the critical components are getting close to the end of the court and having a very fast shutter speed (which usually means a very wide aperture).

Shooting RAW is soooo critical. It’ll give you some flexibility to adjust the exposure and do some sharpening. Since you’ll probably have a relatively high ISO, the noise reduction capabilities are important as well. Always shoot RAW.

If you’re a photographer that loves sports, it is definitely fun to get good seats and work on your sports shooting skills. Can be a bit expensive to do on a regular basis though!

Fast Shutter Speed and very wide aperture is critical for shooting sports

 

Tips on Photographing Whales – Underwater and Above


I’ve spent the last 7 years going out to Maui during the winter to photograph whales. Hawaii is the migration destination of the North Pacific Humpback Whales. Over the course of four months, it’s estimated that about 12,000 whales migrate from Alaska to Hawaii. During the peak months Jan 15 – March 15th or so, there’s probably about 6000+ whales around Hawaii. This creates a really awesome opportunity to photograph them as they are EVERYWHERE.

Many of the boats that go out are small, zodiac type boats. This allows you to hang over the side if you’ve got an underwater camera. Very cool if they come up to the boat, as this picture shows! (you can’t dive with them as it’s a national sanctuary for the whales)

A photographer can hang over the side of a boat to get underwater photos of the humpback whales.

The result is shots like this below the water:

Photographing whales underwater is usually done hanging over the side of a boat.

Or above the water:

A beautiful shot of a whale breaching in Maui

So ya wanna be whale paparazzi? Here are a few tips on getting great photographs of whales:

1- Patience: Most of the time the whales are below the water surface and out of range of an underwater camera. There’s a lot of ‘whale waiting’ going on. It may take quite a few trips before a whale gets close enough to shoot underwater. To capture the above the water activity you really need to pay attention. Frequently it happens very quickly and is over before you can even get your camera up if you’re distracted by talking or looking at photos on your camera. Stay present and focused.

2- Aperture Priority mode: Both above and below the water I set the camera to Aperture Priority with the lowest f-stop I can, getting the aperture as wide open as possible. You want as fast a shutter speed as possible (for 50 ton animals, they can move FAST!), and the widest aperture will give you that. You also get that nice shallow depth of field a low f-stop provides.

3- Autofocus: You have to have autofocus turned on. The action happens too fast to focus manually. Also, use AF points that are calculated on both the horizontal and vertical axes; not all AF points are created equal.

4- Lenses: For above the water, a 100mm-400mm is a good lens for the distance the boats usually keep from the whales. It’s not great if the whales come right up to the boat… but that’s when you bust out your underwater camera with a very wide angle or fisheye lens. With underwater photography, at least in Maui, you can only photograph the whales if they come close to the boat. You’re not going to be able to operate a zoom lens while hanging over the side of a boat, so set a pretty wide focal length when you put it into the housing. I’ve got a 12-17mm Tokina fisheye and usually set it to about 14mm. This means the whale has to be within about 10 feet of the boat to get a good shot. But due to underwater visibility, that’s pretty much the case no matter what lens you have on the camera.

5- Burst Shooting: Make sure you set the camera to burst mode. The more photos the camera can take when you press and hold the shutter button the better.

6- Luck: You need a lot of luck. But part of luck is being prepared to take advantage of the opportunities that come up. If you get a whale that’s breaching over and over, stay focused with your camera ready, because you don’t know where he’s going to come up. Or if a whale comes up to the boat, make sure that underwater camera is ready with a fully charged battery and a big, empty flash card, and that you know how to use the controls on the housing. (Trust me… most of these tips were learned the hard way.)

Many whale watches will mostly consist of ‘whale waiting’. But if you stay present and your gear is set up correctly, you’ll be in great shape to capture those moments when you’re almost touched by a whale!

A whale photographed just out of arm’s reach. The whale is just about touching the camera.

Avoiding Prop Flicker when Shooting Drone Video Footage


We released a new tutorial showing how to remove prop flicker, so if you have flicker problems on drone footage, check that out. (It’s also at the bottom of this post)

But what if you want to avoid prop flicker altogether? Here are a few tips.

But first, let’s take a look at what it is. Here’s an example video:

1- Don’t shoot in such a way that the propellers are between the sun and the camera. Prop flicker happens because the props cast shadows onto the lens. If the sun is above and in front of the lens, that’s when you’ll get the shadows and the flicker. (Shooting at sunrise or sunset is fine because the sun is below the props.)

1b- Turning the camera just slightly from the angle generating the flicker will often get rid of the flicker. You can see this in the tutorial below on removing the flicker.

2- Keep the camera pointed down slightly. It’s more likely to catch the shadows if it’s pointing straight out from the drone (parallel to the props). Tilt it down a bit, 10 or 20 degrees, and that helps a lot.

3- I’ve seen lens hoods for the cameras. Sounds like they help, but I haven’t personally tried one.

Unfortunately, sometimes you have to shoot in such a way that you can’t avoid prop flicker. In which case, using a plugin like Flicker Free allows you to eliminate or reduce the problem. You can see how to deflicker videos with prop flicker in the tutorial below.

Removing Flicker from Drone Video Footage caused by Prop Flicker


Drones are all the rage at the moment, and deservedly so, as some of the images and footage being shot with them are amazing.

However, one problem that occurs is that if the drone is shooting with the camera at the right angle to the sun, shadows from the props cause flickering in the video footage. This can be a huge problem, making the video unusable. It turns out that our Flicker Free plugin is able to do a good job of removing or significantly reducing this problem. (of course, this forced us to go out and get one. Research, nothing but research!)

Here’s an example video showing exactly what prop flicker is and why it happens:

There are ways around getting the flicker in the first place: Don’t shoot into the sun, have the camera pointing down, etc. However, sometimes you’re not able to shoot with ideal conditions and you end up with flicker.

Our latest tutorial goes over how to solve the prop flicker issue with our Flicker Free plugin. The technique works in After Effects, Final Cut Pro, Avid, Resolve, etc.; however, the tutorial shows Flicker Free being used in Premiere Pro.

The full tutorial is below. You can even download the original flickering drone video footage and AE/Premiere project files by clicking here.

Speeding Up Flicker Free: The Order You Apply Plugins in Your Video Editing App


One key way of speeding up the Flicker Free plugin is putting it first in the order of effects. What does this mean? Let’s say you’re using the Lumetri Color Corrector in Premiere. You want to apply Flicker Free first, then apply Lumetri. You’ll see about a 300+% speed increase vs. applying Lumetri first. It looks like this:

Apply Flicker Free first in your video editing application to increase the rendering speed.

Why the Speed Difference?

Flicker Free has to analyze multiple frames to de-flicker your footage; it looks at up to 21 frames. If you have another effect applied before Flicker Free, that means Lumetri is being applied TWENTY ONE times for every frame Flicker Free renders. And especially with a slow effect like Lumetri, that will definitely slow everything down.

In fact, on slower machines it can bring Premiere to a grinding halt. Premiere has to render the other effect on 21 frames in order to render just one frame for Flicker Free. Flicker Free takes up a lot of memory, the other effect can take up a lot of memory, and things get ugly fast.
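As a back-of-the-envelope model of why the order matters (all the millisecond numbers here are invented for illustration, not measurements of Premiere or Lumetri):

```python
def render_cost_ms(upstream_ms, analysis_window=21, upstream_first=True):
    """Rough per-frame cost: if an upstream effect sits before a
    multi-frame plugin, the host re-renders it for every frame in
    the plugin's analysis window."""
    deflicker_ms = 5  # hypothetical cost of the de-flicker pass itself
    if upstream_first:
        return upstream_ms * analysis_window + deflicker_ms
    return upstream_ms + deflicker_ms  # upstream runs once, afterwards

# A hypothetical 10 ms color-correction pass:
slow = render_cost_ms(10, upstream_first=True)   # 10 * 21 + 5 = 215 ms
fast = render_cost_ms(10, upstream_first=False)  # 10 + 5 = 15 ms
print(slow, fast)
```

In this toy model the wrong order is over 14x slower per frame; the real penalty depends on the host app and the particular effect.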

Renders with Happy Endings

So to avoid this problem, just apply Flicker Free before any other effects. This goes for pretty much every video editing app. The render penalty varies depending on the host app and what effect(s) you have applied. For example, using the Fast Color Corrector in Premiere Pro resulted in a slowdown of only about 10% (vs. a 320% slowdown with Lumetri). In After Effects, the slowdown was about 20% with just the Synthetic Aperture color corrector that ships with AE. However, if you add more filters it can get a lot worse.

Either way, you’ll have much happier render times if you put Flicker Free first.

Hopefully this makes some sense. I’ll go into a few technical details for those that are interested. (Feel free to stop reading if it’s clear you just need to put Flicker Free first) (oh, and here are some other ways of speeding up Flicker Free)

Technical Details

With all host applications, Flicker Free, like all plugins, has to request frames through the host application’s API. Most plugins, like the Beauty Box Video plugin, only need to request the current frame. You want to render frame X: Premiere Pro (or Avid, FCP, etc.) loads the frame, renders any plugins, and then displays it. Plugins get rendered in the order you apply them. Fairly straightforward.

The Flicker Free plugin is different. It’s not JUST looking at the current frame. In order to figure out the correct luminance for each pixel (thus removing flicker) it has to look at pixels both before and after the current frame. This means it has to ask the API for up to 21 frames, analyze them, return the result to Premiere, which then finishes rendering the current frame.

So the API says, “Yes, I will do your bidding and get those 21 frames. But first, I must render them!” And so it does. If there are no plugins applied to them, this is easy: it just hands Flicker Free the 21 original frames and goes on its merry way. If there are plugins applied, the API has to render those on each frame it gives to Flicker Free. Flicker Free has to wait for all 21 frames to be rendered before it can render the current frame. It waits, therefore YOU wait. If you need a long coffee break, these renders can be great. If not, they’re frustrating.
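Here’s a toy sketch of that request flow. This is not the real Premiere or AE API (the class and method names are invented); it just counts how many times an upstream effect gets re-rendered when a 21-frame plugin asks the host for its analysis window:

```python
class ToyHost:
    """Toy host app: renders a frame by running the effect chain in
    order. A multi-frame effect can call back for neighboring frames."""

    def __init__(self, chain):
        self.chain = chain
        self.upstream_renders = 0  # how often upstream effects re-run

    def get_frame(self, n, upto):
        """Render source frame n through the effects before index upto."""
        frame = f"src{n}"
        for effect in self.chain[:upto]:
            self.upstream_renders += 1
            frame = f"{effect}({frame})"
        return frame

    def render(self, n):
        frame = f"src{n}"
        for i, effect in enumerate(self.chain):
            if effect == "deflicker":
                # Needs 21 neighbors, each rendered through everything
                # applied *before* it in the chain.
                window = [self.get_frame(m, i) for m in range(n - 10, n + 11)]
                frame = f"deflicker[{len(window)} frames]"
            else:
                frame = f"{effect}({frame})"
        return frame

# Color correction first: it runs 21 extra times per de-flickered frame.
h1 = ToyHost(["color", "deflicker"])
h1.render(100)
# De-flicker first: the 21 neighbors need no upstream rendering at all.
h2 = ToyHost(["deflicker", "color"])
h2.render(100)
print(h1.upstream_renders, h2.upstream_renders)  # 21 vs 0
```

The amplification grows with every extra effect you stack before the multi-frame plugin, which is why the order makes such a dramatic difference.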

If you use After Effects you may be familiar with pre-comping a layer with effects so that you can use it within a plugin applied to a different layer. That goes through a different portion of the API than a plugin requesting frames programmatically from AE. When you select a layer in a plugin’s layer pop-up, the plugin just gets the original image with no effects applied. But if the plugin asks AE for a frame directly, AE has to render it, effects and all.

One other thing affects speed behind the scenes: some apps are better than others at caching frames that plugins ask for. After Effects does this pretty well, Premiere Pro less so. This helps AE have faster render times when using Flicker Free and rendering sequentially. If you’re jumping around the timeline it matters less.

Hopefully this helps you get better render times from Flicker Free. The KEY thing to remember however, is ALWAYS APPLY FLICKER FREE FIRST!

Happy Rendering!

Beauty Work for Corporate Video

Like Digital Anarchy On Facebook

 

We love to talk about how Beauty Box Video is used on feature films by the likes of Local Hero Post and Park Road Post Production, or on broadcast TV by NBC or Fox. That’s the big, sexy stuff.

However, many, if not most, of our customers are like Brian Smith: using Beauty Box for corporate clients or local commercials. They might not be winning Emmy awards for their work but they’re still producing great videos with, usually, limited budgets. “The time and budget does not usually afford us the ability to bring in a makeup artist. People that aren’t used to being on camera are often very self-conscious, and they cringe at the thought of every wrinkle or imperfection detracting from their message,” said Brian, founder of Ideaship Studios in Tulsa, OK. “Beauty Box has become a critical part of our Final Cut X pipeline because it solves a problem, it’s blazing fast, and it helps give my clients and on-camera talent confidence. They are thrilled with the end result, and that leads to more business for us.”

An Essential Tool for Beauty Work and Retouching

Beauty Box Video has become an essential tool at many small production houses or in-house video departments to retouch makeup-less/bad lighting situations and still end up with a great looking production. The ability to quickly retouch skin with an automatic mask without needing to go frame by frame is important. However, it’s usually the quality of retouching that Beauty Box provides that’s the main selling point.

Example of Brian Smith's skin retouching for a corporate client. Image courtesy of Ideaship Studios.

Beauty Box goes beyond just blurring skin tones. We strive to keep the skin texture and not just mush it up. You want the skin to look like skin, not plastic, which is important for beauty work: taking a few years off the talent and offsetting the harshness that HD/4K and video lights can add to someone. The above image of one of Brian’s clients is a good example.

When viewed at full resolution, the wrinkles are softened but not obliterated. The skin is smoothed but still shows pores. The effect is really that of digital makeup, as if you actually had a makeup artist to begin with. You can see this below in the closeup of the two images. Of course, the video compression in the original already has reduced the detail in the skin, but Beauty Box does a nice job of retaining much of what is there.

Closeup of the skin texture retained by Beauty Box

“On the above image, we did not shoot her to look her best. The key light was a bit too harsh, creating shadows and bringing out the lines. I applied the Beauty Box Video plugin, and the shots were immediately better by an order of magnitude. This was just after simply applying the plugin. A few minutes of tweaking the mask color range and effects sliders really dialed in a fantastic look. I don’t like the idea of hiding flaws. They are a natural and beautiful part of every person. However, I’ve come to realize that bringing out the true essence of a person or performance is about accentuating, not hiding. Beauty Box is a great tool for doing that.” – Brian Smith

Go for Natural Retouching

Of course, you can go too far with it, as with anything. So some skill and restraint is often needed to get the effect of regular makeup without making the subject look ‘plastic’ or blurred. As Brian says, you want things to look natural.

However, when used appropriately you can get some amazing results, making for happy clients and easing the concerns of folks that aren’t always in front of a camera. (particularly men, since they tend to not want to wear makeup… and don’t realize how much they need it until they see themselves on a 65″ 4K screen. ;-)

One last tip, you can often easily improve the look of Beauty Box even more by using tracking masks for beauty work, as you can see in the tutorials that link goes to. The ability of these masks to automatically track the points that make up the mask and move them as your subject moves is a huge deal for beauty work. It makes it much easier to isolate an area like a cheek or the forehead, just as a makeup artist would.

Live Video Streaming on the Cheap


I’ve been live streaming various events for small organizations for a while. Most recently for the Against the Stream Meditation Center in SF (if you’re into mindfulness meditation and dharma talks, check it out on Mondays at 7:30pm PST).

Meditation centers don’t usually have a ton of money so we needed to figure out how to do things relatively cheaply. In the past I’ve used Ustream for other organizations, but they’re expensive. Especially for a non-profit.

Note: This assumes you want to do a relatively professional looking stream. If you’re just looking to stream you playing a video game or something there are even cheaper ways to do it. This article doesn’t cover that though. (although YouTube is still a great choice for that)

Looking around for alternatives I discovered that YouTube now does live streaming. For free. I’m usually skeptical of free offerings but they have been fantastic. Quality, analytics, control have all been on par with what I’m used to with Ustream, if not better. If you want to put up paid content then Ustream has some advantages but it’s very expensive. If you’re just trying to live stream your user group, meditation, or whatever meeting then I highly recommend taking a look at YouTube. (in YouTube go to Video Manager and then select Live Streaming. See image below)

So that takes care of one big component, the delivery network. YouTube: FREE. (YouTube records everything and then posts it after you’re done. If you’re doing test streams, make sure you turn that function off. I had a mildly embarrassing test video get posted accidentally.)

Digital Anarchy's YouTube Live Stream Dashboard

Let’s talk about software.

Currently I use Wirecast Studio. This is great software for streaming productions. You can have overlays, animated lower thirds, multiple cameras, multiple audio streams, etc. It’s really a great live production environment and works with virtually every CDN (content delivery network). It’s also $500. If you’re doing a serious production it’s worth it though.

HOWEVER, with YouTube you can get Wirecast Play for free. Yep, once again FREE. This is a bit more limited: you can only have one camera (which is just fine for most small events), there’s no preview (whatever you select immediately goes live), and it only works with YouTube. However, if you’re only streaming on YouTube… not really a problem. It’s FREE and has many of the features of its big, $500 brother. It also only works with Black Magic capture cards, so that’s another potentially big limitation of Play. (See further down for the hardware I recommend.)

So, software: FREE. (You can buy Wirecast Play Studio for $279. This is the $500 app restricted to YouTube. If you can afford it, I highly recommend this; the ability to preview is kind of a big deal. But play with the free version and see how that works for you.) A screenshot of Wirecast Studio is below. As you can see, it looks much like a normal video editing app, so it’s very intuitive. (And yes, if they ever allow plugins we’d love to port Beauty Box Video to it. :-)

Wirecast interface for video live streaming

And now we get to Hardware. Hardware is not usually free unfortunately. So this is where the expenses come in. However you’ll see it won’t be too bad, other than maybe the computer.

You’ll need a computer. I sort of recommend MacBook Pros. Macs still handle audio/video stuff with fewer problems than their PC counterparts. Apple unfortunately isn’t very generous with the hardware, though, so small disks and limited RAM can create bottlenecks. If you don’t need the machine to be portable and can use a desktop machine, you’re better off. If you’re pretty tech savvy, then a PC is fine. Just realize they can be more finicky when it comes to getting video in. (Although Windows is getting better.)

I’ll go over various problems with the software and hardware in part 2. There are lots of quirks to getting video into a computer and getting it to spit it back out to the interwebs, so that needs its own blog post. The MacBook Pro has fewer quirks, so that’s the machine of choice for me.

Internet Connection: You have to have pretty fast internet (bare minimum is 1 megabit upload speed) and expect to use a wired connection. Do NOT use WiFi. WiFi is relatively unstable, slower, and you’re much more likely to have problems. Run a cable directly into your modem or router. It will definitely help if other people are not using the connection. Having someone start watching Netflix while you’re trying to stream will not go well.

Having your connection constantly dropping really sucks and makes for a lot of stress. Get a fast connection, wire straight from your modem to the computer and kick everyone else off the line. Much more likely to have a good stream for your viewers and less stress for you.
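If you want a quick sanity check on your connection, here’s a rough rule of thumb I’d suggest (the 2x headroom factor is my own assumption, not a YouTube requirement): measure your actual upload speed and make sure it’s at least double the bitrate you plan to stream at.

```python
def upload_ok(stream_kbps: float, measured_upload_kbps: float,
              headroom: float = 2.0) -> bool:
    """Return True if the measured upload speed leaves comfortable headroom
    over the planned stream bitrate. The 2x factor is a conservative
    rule of thumb, not an official requirement."""
    return measured_upload_kbps >= stream_kbps * headroom

# Streaming at ~1500 kbps on a 5 Mbps (5000 kbps) upload: plenty of room.
print(upload_ok(1500, 5000))  # -> True

# Same stream on a 2 Mbps upload: too close for comfort.
print(upload_ok(1500, 2000))  # -> False
```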

Internet Connection: $50/mo (give or take)

Computer: MacBook Pro for $1600 (If you can use a desktop Mac/PC or laptop PC then the cost will be much less. I recommend buying the other bits first and trying everything out with an existing computer. If you can use your existing computer so much the better.)

Video Capture Card: Blackmagic Design Intensity Shuttle: This is what I use. It works well with the Mac but requires Thunderbolt. It took a bit of time to get it set up and find the right setting to get it to work with the Panasonic camcorder I use.  From my experience and most accounts, it’s very finicky. Expect to spend some time setting it up and possibly calling Black Magic (who were very helpful and got me up and running but other folks have reported having less positive experiences).

Frankly there aren’t a lot of other good, inexpensive solutions. So even though it’s not perfect, once you get it set up, it does work and will only set you back about $230. There’s also a USB 3.0 version which I have not tried.

Video Capture Card: $230 Blackmagic Design Intensity Shuttle Thunderbolt

Video Camera: I’ve been using Panasonic camcorders but any camcorder with an HDMI port will work. I know the correct setting on the Black Magic Intensity (1080i59.94) that goes with the Panasonics (video quality: 1080HG), so that’s what I stick with. But with a little experimentation I’m sure you can figure out the settings for any HD camcorder with HDMI out.

Video Camera: $150 Panasonic HC-V160 Camcorder

I’ve been using the Panasonic 4K camcorder but that’s just cuz I’m a geek. It’s a great camera and certainly works well, but total overkill if all you’re doing is streaming. Just get a basic HD camcorder.

Microphone: If you’re doing this on the cheap, just use the camcorder microphone. Easy and usually sounds ok. I’ll probably do another blog post on audio. There are lots of options and not easily covered in a couple paragraphs. Using the Camcorder Mic will be Free and easy. It won’t sound _amazing_ but should work. One advantage of the on camera mic is that it’s great for picking up the audience. Even if I have the speaker mic’d up, I’ll switch to the camcorder mic if someone in the audience is speaking (if there’s no audience mic). (This is one instance where Wirecast Studio is preferable to Wirecast Play)

Cables: You’ll need an ethernet cable and an HDMI cable or two. Buy them from Amazon; they’re cheap and work great. Cost: $20-30 or so. Make sure you figure out what length you need. You may not be next to the modem, so a long ethernet cable may be necessary. The longest HDMI cable I’ve been able to use is 12 feet. It seems cameras don’t have as strong an HDMI signal as TVs and can’t drive very long cables. Make sure you test everything well in advance of your event.

Actually, let me say that again: Make sure you test everything well in advance of your event. Streaming is quirky and you need to have confidence that all your components and cables will play nicely together.

So the bottom line, assuming you have the computer and a decent internet connection, is:

Hardware: $410 : $230 for the Intensity Shuttle, $150 for the camera and $30 for cables.

Service: FREE : YouTube live streaming

Streaming Software: FREE (or $279) : Wirecast Play or Wirecast Play Studio

 

Removing Flicker from Stadium Lights in Slow Motion Football Video

One common problem is flickering from stadium lights when football or other sports are played back in slow motion. You’ll even see it during the NFL playoffs. Stadium lights tend to be low quality and their brightness fluctuates. You can’t see it normally, but play video back at 240fps… and flicker is everywhere.

Aaron at Griffin Wing Video Productions ran into this problem shooting video of the high school football championship at the North Carolina State stadium. It was a night game and he got some great slomo shots shooting with the Sony FS700, but a ton of flicker from the stadium lights.

Let’s take a look at a couple of his examples and break down how our Flicker Free plugin fixed the problem for him.

The first example is just a player turning his head as he gazes down on the field. There’s not a lot of fast movement, so this is relatively easy. Here are the Flicker Free plugin parameters from within After Effects (although it works the same if you’re using Premiere, FCP, Avid, etc.):

Video Footage of Football Player with Flickering Lights

Notice that ‘Detect Motion’ is turned off, and note the settings for Sensitivity and Time Radius. We’ll discuss those in a moment.

Here’s a second example of a wide receiver catching the football. Here there’s a lot more action (even in slow motion), so the plugin needs different settings to compensate for that motion. Here’s the before/after video footage:

Here are the Flicker Free plugin settings:

Football player catching ball under flickering lights

So, what’s going on? You’ll notice that Detect Motion is off. Detect Motion tries to eliminate the ghosting (see below for an example) that can happen when removing flicker across a bunch of frames. (FF analyzes multiple frames to find the correct luminance for each pixel, but ghosts or trails can appear if a pixel is moving.) Unfortunately it also reduces the flicker removal capabilities. The video footage we have of the football team has some pretty serious flicker, so we need Detect Motion off.

With Detect Motion off we need to worry about ghosting. This means we need to reduce the Time Radius to a relatively low value.

Time Radius tells Flicker Free how many frames to look at before and after the current frame. So if it’s set to 5, it’ll analyze 11 frames: the current frame, 5 before it, and 5 after it. The more frames you analyze, the greater the chance objects will have moved in other frames… resulting in ghosting.
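The window arithmetic described above can be written out as a trivial sketch (nothing Flicker Free-specific, just the frame count):

```python
def frames_analyzed(time_radius: int) -> int:
    """Total frames examined: the current frame plus time_radius
    frames before it and time_radius frames after it."""
    return 2 * time_radius + 1

print(frames_analyzed(5))  # -> 11 (fine for low-motion footage)
print(frames_analyzed(3))  # -> 7 (a lower radius for footage with more motion)
```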

With the player just turning his head as he gazes down at the field, there’s not a lot of motion. So we can get away with a Time Radius of 5 and a Sensitivity of 3. (More about Sensitivity in a moment.)

The video with the receiver catching the ball has a LOT more motion. Each frame is very different from the next. So there’s a good chance of ghosting. Here we’ve set Time Radius to 3, so it’s analyzing a total of 7 frames, and set Sensitivity to 10. A Time Radius of 3 is about as low as you can realistically go. In this case it works and the flicker is gone. (As you can see in the above video)

Here’s an example of the WRONG settings and what ‘ghosting’ looks like:

Blurry Video Caused by incorrect Flicker Free settings

Sensitivity is, more or less, how large of an area the Flicker Free plugin analyzes. Usually I start with a low value like 3 and increase it to find the value that works best. Frequently a setting of 3 works, as lower values reduce the flicker more. However, low values can result in more ghosting, so if you have a lot of motion sometimes 5 or 10 works better. For the player turning his head, 3 was fine. For the receiver we needed to increase it to 10.

So that’s a breakdown of how to get rid of flicker from stadium lights! Thanks to Aaron at Griffin Wing Video Productions for the footage. You can see their final video documenting the High School Football Championship on YouTube.

And you can also view more Flicker Free tutorials if you need additional info on how to get the most out of the Flicker Free plugin in After Effects, Premiere Pro, Final Cut Pro, Avid, or Resolve.

Don’t Go To Art School, Especially for Video/Film/VFX


I’ve written about this before, but Forbes recently wrote a couple awesome pieces taking down San Francisco’s Academy of Art, really spelling out why for-profit art schools are such an overpriced scam. And they are.

Rule #1: Don’t go into massive debt to get an art degree

The ‘Starving Artist’ is a thing. Don’t compound it with debt.

For-profit schools will promise you anything to get you to take out a federally backed student loan. You can’t bankrupt yourself out of that loan, so it’s guaranteed money to the school. They couldn’t care less if you succeed. They will certainly promote those few students that do succeed in a big way, but most end up like our former admin assistant: Academy of Art photography degree, a ton of debt, and a $15/hr job as an admin assistant.

And those that are successful, would be successful anywhere because they have the right mix of work ethic, skills, and talent. Especially the work ethic.

There are amazing instructors even at community colleges. I’m going to do another post soon profiling Community College of San Francisco and their excellent broadcast department with a great studio. Full switcher and control room, 4K cameras, greenscreen and all of it. Misha Antonich, the head of the department, has set up a great program for all things broadcast. We hired our QA/Tech Support guy out of there. (Tor, who some of you have probably talked with.)

So don’t get caught up in the supposed ‘prestige’ (i.e. marketing budget) of a for-profit school or other expensive school. It’s an illusion. Expensive tuition does not mean better results. You’ll do just fine at a community or state college. Ultimately, it’s your work ethic and demo reel that will make you successful.

Rule #2: Work ethic and internships

You’ll learn more in 3 months of an internship than a year in school. It’s also something that will stand out on your resume MUCH more than where you went to school. Make it happen.

The jobs you’ve had are what sells you. Spending $100K on a filmmaking or VFX degree is usually just a good way to get entry level jobs. There are much cheaper ways to get entry level jobs.

To get internships (and entry level jobs), you’ll need to do a lot of work on your own. But if you’re really into editing, vfx, or whatever, this should be something you WANT to do. You should be totally into the type of work you’re trying to get. If you’re working on a personal project and you look up and realize it’s 4am because you’re so into what you’re doing that time flies by… that’s a really good indication you’re doing the right thing.

So dig through as many online tutorials as you can, do lots of personal projects, get together with other students and do cool stuff. It’ll all get you to the point of having a reel you can use to get internships.

One caveat: just because someone is teaching it doesn’t mean they’re right. With editing or visual effects there are usually 10 different ways of doing anything, and they’re all correct depending on the situation. For example, you’ll find the occasional colorist throwing an online hissy fit over digital beauty work using Beauty Box because they think it’s putting beauty artists out of work (yes, I’ve actually had an online argument about this), or it’s not true beauty work, or whatever. However, you can use Beauty Box in many workflows, and we have many excellent colorists that use Beauty Box for beauty work on feature films, high end music videos, and national commercials. But some folks have _their_ way of doing something and feel that’s the only way. Don’t be like that. Be flexible and you’ll be a better artist (not to mention being able to work with different time/budget constraints).

Rule #3: Networking and self promotion

The other benefit of internships is you get to meet people. This is critical.

Of course, there are many other ways of meeting people. Go to user groups, join professional meetups, anything where you can meet folks that are doing what you want to do. It’s a good way to get internships, jobs, and good advice.

And you need to promote yourself. Most artists don’t get into doing art because they enjoy sales, but that is the business side to the industry. You need to talk about yourself or, at least, what you’ve been doing. Make sure you have a business card, a web site with your demo reel on it, and examples of your work on your phone.

The business side is every bit as important as your work when it comes to being successful. ALL schools tend to gloss over this. Art majors don’t want to take business classes. If you’re going to succeed, it’s critical that you understand the business side.

Rule #4: Persistence

Don’t give up and definitely follow up. If someone introduces you to someone that has a job/intern opening, follow up with them. Make sure they know you’re interested. Ask them if they need any additional information and don’t be afraid to ask for an interview. People want to hire folks that are proactive and show a willingness to make an effort. It matters. A lot.

Even if there’s not a job involved, most people are willing to help you. But you have to be proactive about it. Don’t be annoying, but if you’ve interacted with them and gotten their card don’t be afraid to send them the occasional email updating them on new projects or things you’ve completed.

So skip the high priced art school. Go to a community college or state college, go through every tutorial you can online, meet folks, do your own projects, get internships, and meet as many people as you can. That’s how you get the skills and contacts that will make you successful. Just get out there and do it. Get an entry level job (you’re going to get one anyways, degree or no) and work your way up.

A school is just a good place to get feedback, get some project ideas, and meet like-minded students. It doesn’t matter if you spend $100/credit or $1000.

Here’s another good article on the film school debate, rising film school costs, and the ever dropping costs of pro camera equipment.

 

 

Wacom Tablets and Repetitive Stress Injuries


I’ve written about this before, but Thanksgiving came along this year and I left on a 5 day, two city trip without my Wacom tablet. Which reminded me exactly why I’m thankful for the tablet.

The downside to running Digital Anarchy is that I don’t really get many  days off. Usually I’m working in some capacity at least a couple hours a day even on vacations. For trips (like Thanksgiving) that involve plane flights and other downtime, it’s usually a lot more than two hours. (Not really complaining, just pointing out that it’s a thing. There’s plenty of awesome stuff about being Chief Executive Anarchist and coming up with cool video plugins for y’all)

I’ve used a Wacom tablet as a mouse replacement since around 2003. I used to run a user group called Bay Area Motion Graphics. Because I and one other DA employee had RSI problems, I got a variety of ‘ergonomic’ devices and had DA folks and members of BAMG try them out. BAMG was mostly video editors and motion graphic artists, to give you some idea of who was using them.

Wacom tablet used with Digital Anarchy Video Plugins

Extra space on your keyboard drawer, yes. Clean desk, no.

We swapped around the weird looking keyboards, joystick mouse things, trackballs, tablets, and other oddments. We then got together and decided which devices seemed to offer relief to the most people.

One of the devices that stood out, especially for me, was the Wacom tablet. Once you get used to using it as a mouse replacement it’s really an awesome device. I have multiple tablets and use them constantly in the office and while traveling. It makes using the computer much less painful.

That’s in stark contrast to the last few days. No tablet, so I’ve been forced to use the trackpad on the two computers I carry around. My wrists immediately started to ache and tingle. Not good. It’s amazing that for the most part I have no problems when using the tablets, but after a couple days of not using them, much of the pain comes back. Of course, RSI is a whole body thing. Not only do your wrists hurt, but you’re in a less ergonomic position (f’ing hotel chairs), so my shoulders and back hurt as well.

Why are the Wacom tablets so effective for helping with RSI? I’m not sure, to be honest. But I think 1) you’re holding the pen as you would a normal pen; this is a skill you’ve been working on since you were a small child and the muscle memory is very strong. And 2) you’re not just using one body part over and over again (like your index finger on a mouse); you’re using your whole hand, wrist, and arm, which distributes the stress over a greater area.

Whatever the case, for me, the tablets have been a godsend. It takes some time to get really familiar with them, but it’s been well worth it for me. Of course, it’s just one part of having an ergonomic workstation but it’s a big one (a great chair is another big one). Your health is critical. Take care of yourself.

Why VR Will Fail. (and AR too)


First off, neither will fail completely. VR will succeed in games and AR will end up like the Segway… used by mall cops and tourists. And, yeah, there’ll be some industrial and entertainment (e.g. theme park rides) applications for both.

But widespread consumer use? No. Fail. Why? Because most people don’t care. At all.

Geeks LOVE, LOVE, LOVE this type of stuff because it’s extremely cool technology. And it’s true, the tech behind it is amazing. However, this does not matter to most people. For most people what matters is 1) does this make my experience better MOST of the time and 2) is it easy to use? Or, more simply: Does this make my experience so much better that it’s worth the effort required to learn and use it?

We ran into this problem with Web 3D when I worked on Cult3D for Cycore, a browser plugin that let you view 3D objects on the web. Really cool tech. Cult3D, and 3D on the web in general, pretty much completely failed. Why? Because a sneaker in 3D gets you no closer to trying it on than a bunch of photos.

And that 3D sneaker costs a LOT more to create than a few photos.

But VR and AR are different than Web3D! No, sorry, they’re not. It’s going to be the same problem. The content creation costs are going to be a killer, and does it really add anything to the experience? Is it the order of magnitude better that it needs to be for most people to invest the time/effort/money in it? Especially since it requires glasses you wouldn’t otherwise need, particularly clunky, tech-looking ones.

For example, the Magic Leap (VR/AR technology startup) website shows a bunch of schoolkids looking at a virtual seahorse. Ok, that’s going to be super awesome… until the novelty wears off. Then… is that virtual seahorse better than high resolution photos and videos showing the seahorse in its natural environment, which can be shown on a smartboard or HD TV (tech that schools already have)? No, probably not.

And do you really think districts are going to outfit entire schools with VR/AR tech and the expensive content? Most schools can’t even buy one smartboard for each classroom… to say nothing of training teachers, many of whom are not very tech literate.

But wait, I’ll be able to see bus stops and find restaurants just by looking around! How often do you actually need to do that? You’re going to wear glasses you don’t need so that you can be visually bombarded with virtual signage and more information? Most of us are already in information overload. For the few times a day I need to check bus schedules, Yelp, or Lyft, I don’t need AR. AR might be marginally better than having to look at my phone, but it’s something I need to WEAR. And how do you control it? Waving your hands around? A fanny pack controller attached to your belt?

Another issue is one that dogged 3D TV: people are social and want to connect, especially by looking into each other’s eyes. I don’t like talking to people who can’t stop looking at their phone. If I can’t see their eyes, or if their eyes are constantly glazed over looking at a retina display… it’s a big problem.

And no, most people don’t want to live in virtual worlds. Yes, for gaming, great. Real life? Give me a f’ing break. Nobody wants to see your dragon avatar walking around the airport.

So between the high content creation costs, the difficulty/cost of using it, the social impediments, and the fact that in most cases it’s not going to improve the experience by an order of magnitude, I don’t see it succeeding as a common, everyday thing for personal use.

EL Capitan, Plugins and the Anarchist


First off, the important bit: all the current versions of our plugins are updated for El Capitan and should be working, regardless of host application (After Effects, Premiere Pro, Final Cut Pro, DaVinci Resolve, etc.). So you can go to our demo page:

http://digitalanarchy.com/demos/main.html

And download the most recent version of your plugins.

If you haven’t upgraded to El Capitan, I’ll add to the chorus of people saying… don’t. Overall we’re disappointed by Apple as it continues its march toward making the Mac work like the iPhone, making professional use cases more and more of an afterthought. They’re trying way too hard to make the machines idiot proof, and in the process dumbing down what can be done with them.

One of the latest examples is, of all things, Disk Utility. You can no longer make a RAID using it and have to use a terminal command. They’ve removed other functionality as well, but for many professional users RAIDs are essential as is Disk Utility. However, it’s now been crippled.
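For reference, the terminal replacement looks something like the sketch below. Check `man diskutil` for the exact syntax on your OS version; the set name, filesystem, and disk identifiers here are hypothetical, and the create command destroys the contents of the listed disks.

```shell
# List disks first to confirm the identifiers (hypothetical: disk2, disk3).
diskutil list

# Create a striped (RAID 0) set named "MediaRAID" formatted as Journaled HFS+.
# WARNING: this erases both disks.
diskutil appleRAID create stripe MediaRAID JHFS+ disk2 disk3
```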

Of course, then there’s Final Cut Pro (which has gotten better but still doesn’t feel like a professional app to many people), Photos, which replaced Apple’s pro app Aperture, and the Mac Pro trashcan. (Kind of sad that when we need a ‘new’ Mac, we usually buy a 2010-12 12-core Mac Pro; they outperform our D500 trashcan.)

Apple isn’t alone in this ‘dumbing down’ trend. Just look at the latest releases of Acrobat (which I’ve heard referred to as the Fisher-Price version) and Lightroom.

Note to Application Developers: Just because we’re doing a lot of things on our phones does not mean we want to do everything on them, or have our desktop apps work like phone apps. There’s a difference between simplicity (making the user experience clear and intuitive while retaining the features that make an app powerful) and stupidity (making apps idiot proof).

Anyways, end of rant… I spend a fair amount of time thinking about software usability, since we have to strike that balance between ease of use and power with our own video plugins, and using the host applications and OS professionally. So this ‘dumbing down’ concerns me both for my personal uses and having to help DA customers navigate new ‘features’ that affect our photo and video plugins.

Cheers,

Jim Tierney
Chief Executive Anarchist
Digital Anarchy

Easy Ways of Animating Masks for Use with Beauty Box in After Effects, Premiere, and Final Cut Pro

Like Digital Anarchy On Facebook

 

We have a new set of tutorials up that will show you how to easily create masks and animate them for Beauty Box. This is extremely useful if you want to limit the skin retouching to just certain areas like the cheeks or forehead.

Traditionally this type of work has been the province of feature films and other big-budget productions that had the money and time to hire rotoscopers to create masks frame by frame. New tools built into After Effects and Premiere Pro, or available from third parties for FCP, make this technique accessible to video editors and compositors with much more modest budgets and schedules.

Using Masks that track the video to animate them with Beauty Box for more precise retouching

How Does Retouching Work Traditionally?

In the past, someone would have to create a mask on frame 1 and then move forward frame by frame, adjusting the mask on EVERY frame as the actor moved. It was a laborious, time-consuming way of retouching video and film. The idea for Beauty Box came from watching a visual effects artist explain his process for retouching a music video of a high-profile band of 40-somethings. Frame by frame by tedious frame. I thought there had to be an easier way, and a few years later we released Beauty Box.

However, Beauty Box affects the entire image by default. The mask it creates affects all skin areas. This works very well for many uses but if you wanted more subtle retouching… you still had to go frame by frame.

The New Tools!

After Effects and Premiere Pro have some amazing new tools for tracking mask points. You can apply bezier masks that limit the effect of a plugin, like Beauty Box, and the bezier points are ‘tracking’ points, meaning that as the actor moves, the points move with him. It usually works very well, especially for talking-head footage where the talent isn’t moving around a lot. It’s a really impressive feature, available in both AE and Premiere Pro. Here’s a tutorial detailing how it works in Premiere:

After Effects also ships with Mocha Pro, another great tool for doing this type of work. This tutorial shows how to use Mocha and After Effects to control Beauty Box and get some, uh, ‘creative’ skin retouching effects!

The power of Mocha is available for Final Cut Pro X as well, as a plugin from CoreMelt, and they were kind enough to do a tutorial explaining how SliceX works with Beauty Box within FCP. It’s another very cool plugin; here’s the tutorial:

Beauty Box Video 4.0 Released for Avid and OpenFX Apps

Like Digital Anarchy On Facebook

 

We’re excited to announce that Beauty Box Video 4.0 is now available for Avid and OpenFX Apps: Davinci Resolve, Assimilate Scratch, Sony Vegas, NUKE, and more. This is in addition to After Effects, Premiere Pro, and Final Cut Pro which were announced in April.

Beauty Box Video 4.0 adds real time rendering to the high quality, automatic skin retouching that Beauty Box is famous for. It’s not only the best retouching plugin available but it’s now one of the fastest, especially on newer graphics cards like the Nvidia GTX 980. We’re seeing real time or near real time performance in Premiere Pro, Resolve, and FCP. Other apps may not see quite that performance but they still get a significant speed increase over what was possible in Beauty Box 3.0.

Easily being able to retouch video is becoming increasingly important. HD is everywhere and 4K is widely available, letting viewers see more detail in closeups of talent than ever before. That makes skin and makeup problems much more visible, and being able to apply digital makeup easily is critical to high-quality productions.

Beauty Box 4.0 on 4K Footage
Beauty Box will smooth out all skin areas, so blemishes on arms are covered up as well as wrinkles or spots on the face, as you can see in this still from a cooking show.

You can also incorporate masks to limit the retouching to just certain areas like cheeks or the talent’s forehead. (as can be seen in this tutorial using Premiere Pro’s tracking masks)

So head over to digitalanarchy.com for more info and to download a free trial and free tutorials on how to get started and more advanced topics. You’ll be blown away by the ease of use, high quality retouching, and now… speed!

Using a Nvidia GTX 980 (or Titan or Quadro) in a Mac Pro

Like Digital Anarchy On Facebook

 

As many of you know, we’ve come out with a real-time version of Beauty Box Video. For that to work, it requires a really fast GPU, and we LOVE the GTX 980 (amazing price/performance). Nvidia cards are generally the fastest for video apps (Premiere, After Effects, Final Cut Pro, Resolve, etc.), but we are seeing real-time performance on the higher-end new Mac Pros (or trash cans, dilithium crystals, Jobs’ urn, or whatever you want to call them).

BUT what if you have an older Mac Pro?

With the newer versions of Mac OS (10.10), in theory, you can put any Nvidia card in them and it should work. Since we have lots of video cards lying around that we’re testing, we wondered if our GTX 980, Titan and Quadro 5200 would work in our Early 2009 Mac Pro. The answer is…

Nvidia GTX GPU in Mac Pro

YES!!!

So, how does it work? For one, you need to be running Yosemite (Mac OS X 10.10).

The GTX 980 is the easier of the two GeForce cards to install, mainly because of the power needed to drive it. It only needs two six-pin connectors, so you can use the power supply built into the Mac. Usually you’ll need to buy an extra six-pin cable, as the Mac only comes standard with one, but that’s easy enough. The Quadro 5200 has only a single six-pin connector and works well. However, for a single offline workstation, it’s tough to justify the higher price for the extra reliability the Quadros give you (and it’s not as fast as the 980).

The tricky bit about the 980 is that you need to install Nvidia’s web driver. The 980 did not boot up with the default Mac OS driver, even in Yosemite. At least, that’s what happened for us. We have heard reports of it working with the default driver, but I’m not sure how common that is. So you need to install the Nvidia Driver Manager System Preference and, while still using a different video card, set it to the Web Driver. Like so:

Set this to Web Driver to use the GTX 980

You can download the Mac Nvidia Web Drivers here:

For 10.10.2

For 10.10.3

For 10.10.4

Install those, set it to Web Driver, install the 980, and you should be good to go.

What about the Titan or other more powerful cards?

There is one small problem… the Mac Pro’s power supply isn’t powerful enough to handle the card, and it doesn’t have the right connectors. The Mac can supply two six-pin power connectors, but the Titan and other top-of-the-line cards require a six-pin and an eight-pin, or even two eight-pin connectors. REMINDER: The GTX 980 and Quadro do NOT need extra power. This is only for cards with an eight-pin connector.

The solution is to buy a bigger power supply and let it sit outside the Mac with the power cables running through the expansion opening in the back.

As long as the power supply is plugged into a grounded outlet, there’s no problem with it being external. I used an EVGA 850W power supply, but I think the 600W would do. The nice thing about these is they come with long cables (about two feet or so) that will reach inside the case to the Nvidia card’s power connectors.

Mac Pro external power supply

One thing you’ll need to do is plug the ‘test’ connector (comes with it) into the external power supply’s motherboard connector. The power supply won’t power on unless you do this.

Otherwise, it should work great! These are very powerful cards and they definitely add a punch to the Mac Pros. With this setup we had Beauty Box running at about 25fps in Premiere Pro (AE and Final Cut are a bit slower). Not bad for a five-year-old computer, but not real time in this case. On newer machines with the GTX 980 you should be getting real-time playback. It really is a great card for the price.

Creative Cloud 2015 and After Effects, Premiere Pro Plug-ins

All of our current plugins have been updated to work with After Effects and Premiere Pro in Creative Cloud 2015. That means Beauty Box Video 4.0.1 and Flicker Free 1.1 are up to date and should work no problem.

Flicker Free 1.1 is a free update which you can download here: http://digitalanarchy.com/demos/main.html

What if I have an older plugin like Beauty Box 3.0.9? Do I have to pay for the upgrade?

Yes, you probably need to upgrade, and it is a paid upgrade. After Effects changed the way it renders, and Premiere Pro changed how it handles GPU plugins (of which Beauty Box is one). The key word here is probably. Our experience so far has been mixed: sometimes the plugins work, sometimes not.

Premiere Pro: Beauty Box 3.0.9 seems to have trouble in Premiere if it’s using the GPU. If you turn ‘UseGPU’ off (at the bottom of the Beauty Box parameter list), it seems to work fine, albeit much slower. Premiere Pro did not implement the same re-design that After Effects did, but it did add an API specifically for GPU plugins. So if a plugin doesn’t use the GPU, it should work fine in Premiere. If it uses the GPU, maybe it works, maybe not. Beauty Box seems not to.

After Effects: Legacy plugins _should_ work, but they slow AE down somewhat. In the case of Beauty Box, it seems to work OK, but we have seen some problems. So the bottom line is: try it out in CC 2015. If it works fine, you’re good to go. If not, you need to upgrade. We are not officially supporting 3.0.9 in Creative Cloud 2015.

– The upgrade from 3.0 is $69 and can be purchased HERE.

– The upgrade from 1.0/2.0 is $99 and can be purchased HERE.

 

The bottom line is try out the older plugins in CC 2015. It’s not a given that they won’t work, even though Adobe is telling everyone they need to update. It is true that you will most likely need to update the plugins for CC 2015 so their advice isn’t bad. However, before paying for upgrades load the plugins and see how they behave. They might work fine. Of course, Beauty Box 4 is super fast in both Premiere and After Effects, so you might want to upgrade anyways. :-)

We do our best not to force users into upgrades, but since Adobe has rejiggered everything, only the current releases of our products will be rejiggered to match.

Creating GIFs from Video: The 4K Animated GIF?

Like Digital Anarchy On Facebook

I was at a user group recently and a video editor from a large ad agency was talking about the work he does.

‘Web video’ encompasses many things, especially when it comes to advertising. The editor mentioned that he is constantly being asked to create GIF animations from the video he’s editing. The video may go on one site, but the GIF animation will be used on another. So while one part of the industry is trying to push 4K and 8K, another part is going backwards to small animated GIFs for Facebook ads and the like.

Online advertising is driving the trend, and it’s probably something many editors deal with daily… creating super high resolution for the broadcast future (which may be over the internet), but creating extremely low res versions for current web based ads.

Users want high resolution when viewing content, but ads that aren’t in the video stream (like traditional ads) can slow down a user’s web browsing experience and cause them to bounce if the file size is too big.

Photoshop for Video?

Photoshop’s timeline is pretty useless for traditional video editing. However, for creating these animated GIFs, it works very well. Save out the frames or short video clip you want to make into a GIF, import them into Photoshop and lay them out on the Timeline, like you would video clips in an editing program. Then select Save For Web… and save it out as a GIF. You can even play back the animation in the Save for Web dialog. It’s a much better workflow for creating GIFs than any of the traditional video editors have.
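If you’d rather script the GIF step than click through Save For Web, the same frames-to-GIF conversion can be sketched with the Pillow imaging library. This is an alternative tool, not part of the Photoshop workflow above, and the frames, file name, and timing here are placeholder values:

```python
from PIL import Image

# Three stand-in frames; in practice you'd load your exported frames, e.g.
# frames = [Image.open(p) for p in sorted(glob.glob("frame_*.png"))]
frames = [Image.new("RGB", (64, 64), c) for c in ("red", "green", "blue")]

# Save the first frame and append the rest to make an animated GIF
frames[0].save(
    "out.gif",
    save_all=True,
    append_images=frames[1:],
    duration=100,  # milliseconds per frame
    loop=0,        # 0 = loop forever
)
```

No playback preview like Save For Web gives you, but it’s handy when a client wants the same clip re-exported at five sizes.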

So, who knew? An actual use for the Photoshop Timeline. You too can create 4K animated GIFs! ;-)

animated GIF

One particularly good example of an animated GIF. Rule #1 for GIFs: every animated GIF needs a flaming guitar.

Odyssey 7Q+ .wav Problem – How to Fix It and Import It into Your Video Editor

Like Digital Anarchy On Facebook

 

We have a Sony FS700 hanging around the Digital Anarchy office for shooting slow-motion and 4K footage to test with our various plugins. (We develop video plugins for Premiere Pro, After Effects, Avid, Final Cut Pro, Resolve, etc., etc.) In order to get 4K out of the camera we had to buy an Odyssey 7Q+ from Convergent Design. (Don’t you love how all these cameras are ‘4K-capable’, meaning if you want 4K, it’s another $2500+? Yay for marketing.)

(btw… if you don’t care about the back story, and just want to know how to import a corrupted .wav file into a video editing app, then just jump to the last couple paragraphs. I won’t hold it against you. :-)

The 7Q+ is a good video recorder overall and we like it a lot, but we recently ran into a problem. One of the videos we shot didn’t have sound. It had sound when played back on the 7Q+, but when you imported it into any video editing application: no audio.

The 7Q+ records 4K as a series of .dng files with a sidecar .wav file for the audio. The wav file had the appropriate size as if it had audio data (it wasn’t a 1Kb file or something) but importing into FCP, Premiere Pro, Quicktime, or Windows Media Player showed no waveform and no audio.

Convergent Design wasn’t particularly helpful. The initial suggestion was to ‘rebuild’ the SSD drives. This was suggested multiple times, as if it were unimaginable that it wouldn’t fix the problem and/or I was an idiot not doing it correctly. The next suggestion was to buy file recovery software. That didn’t really make sense either. The .dng files making up the video weren’t corrupted, the 7Q+ could play the clip back, and the file was there with the appropriate size. It seemed more likely that the 7Q+ wrote the file incorrectly, in which case file recovery software would do nothing.

Googling around for people with similar problems, I discovered 1) at least a couple of other 7Q users have had the same problem, and 2) there were plenty of non-7Q users with corrupted .wav files. One technique for the #2 folks was to pull the files into VLC Media Player. Would this work for the 7Q+?

YES! Pull it into VLC, then save it out as a different .wav (or whatever) file. It then imported and played back correctly. Video clip saved and I didn’t need to return the 7Q+ to Convergent and lose it for a couple weeks.
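VLC does the repair for you, but for the curious: a very common way recorders botch a .wav is by writing the audio data and never finalizing the RIFF and ‘data’ chunk size fields in the header, so players see a file that claims to contain zero samples. I can’t confirm that’s exactly what the 7Q+ did, but as a sketch, patching those size fields from the actual file length looks like this:

```python
import struct

def fix_wav_sizes(path_in, path_out):
    """Rewrite the RIFF and 'data' chunk size fields based on the actual
    file length -- a typical fix when a recorder writes the samples but
    never finalizes the header."""
    with open(path_in, "rb") as f:
        data = bytearray(f.read())
    assert data[0:4] == b"RIFF" and data[8:12] == b"WAVE", "not a RIFF/WAVE file"
    # Overall RIFF chunk size = file length minus the 8-byte RIFF header
    struct.pack_into("<I", data, 4, len(data) - 8)
    # Locate the 'data' chunk and patch its size field as well
    pos = data.find(b"data", 12)
    if pos != -1:
        struct.pack_into("<I", data, pos + 4, len(data) - (pos + 8))
    with open(path_out, "wb") as f:
        f.write(data)
```

If the corruption is something other than bad size fields, this won’t help, which is why VLC (which largely ignores the header and parses the stream itself) is the more robust hammer.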

Other than this problem the Odyssey 7Q+ has been great… but this was a pretty big problem. Easily fixed though thanks to VLC.

4K Showdown! New MacPro vs One Nvidia GTX 980

Like Digital Anarchy On Facebook

 

For NAB this year we finally bought into the 4K hype and decided to have one of our demo screens be a 4K model, showing off Beauty Box Video and Flicker Free in glorious 4K.

NAB Booth Beauty Box Video and Flicker Free in 4k
The Digital Anarchy NAB Booth

So we bought a 55” 4K Sony TV to do the honors. We quickly realized if we wanted to use it for doing live demos we would need a 4K monitor as well. (We could have just shown the demo reel on it) For live demos you need to mirror the computer monitor onto the TV. An HD monitor upscaled on the 4K TV looked awful, so a 4K monitor it was (we got a Samsung 28″, gorgeous monitor).

Our plan was to use our Mac Pro for this demo station. We wanted to show off the plugins in Adobe’s AE/Premiere apps and Apple’s Final Cut Pro. Certainly our $4000 middle of the line Mac Pro with two AMD D500s could drive two 4K screens. Right?

We were a bit dismayed to discover that it would drive the screens only at the cost of slowing the machine down to the point of being unusable. Not good.

For running Beauty Box in GPU accelerated mode, our new favorite video card for GPU performance is Nvidia’s GTX 980. The price/performance ratio is just amazing. So we figured we’d plug the two 4K screens into our generic $900 Costco PC that had the GTX 980 in it and see what kind of performance we’d get out of it.

Not only did the 980 drive the monitors, it still ran Beauty Box Video in real time within Premiere Pro. F’ing amazing for a $550 video card.

The GTX 980 single handedly knocked out the Mac Pro and two AMD D500s. Apple should be embarrassed.

I will note that for rendering and general use of the apps, the Mac Pro is about on par with the $900 PC + 980. I would still expect more performance from Apple’s $4000 machine, but at least it’s not an embarrassment.

FCP 7 Is Dead. It’s Time to Move On.

It’s been almost four years since the last update to FCP 7. The last officially supported OS was 10.6.8. It’s time to move on, people.

Beauty Box Video 4.0 (due out in a month) will be our first product that does not officially support FCP 7.

It’s a great video editor, but Apple makes it very hard to support older software, especially if you’re trying to run it on newer systems. If FCP 7 is a mission-critical app for you, you’re taking a pretty big risk by trying to keep it grinding along. We started seeing a lot of weird behaviors with it on 10.9. I realize people are running it successfully on the new systems, but we feel there are a lot of cracks beneath the surface. Those are only going to get more pronounced with newer OSes.

I know people love their software, hell there are still people using Media 100, but Premiere Pro, Avid, and even FCP X are all solid alternatives at this point. Those of us that develop software and hardware can’t support stuff that Apple threw under the bus 3 and a half years ago.

We will continue to support people using Beauty Box 3.0 with FCP 7 on older systems (10.8 and below), but we can’t continue supporting it when the problems we’d be fixing are most likely caused not by our software but by old FCP code breaking on new systems.

iPhone 6 vs Sony FS700: Comparison of Slow Motion Modes (240fps and Higher)

Like Digital Anarchy On Facebook

 

Comparing slow motion modes of the iPhone 6 vs the Sony FS700

The Sony FS700 is a $6000 video camera that can shoot HD up to 960fps or 4K at 60fps. It’s an excellent camera that can shoot some beautiful imagery, especially at 240fps (the 960fps footage really isn’t all that, however).

The iPhone 6 is a $700 phone with a video camera that shoots at 240fps. I thought it’d be pretty interesting to compare the iPhone 6 to the Sony FS700. I mean, the iPhone couldn’t possibly compare to a video camera that is dedicated to shooting high speed video, right? Well, ok yes, you’re right. Usually. But surprisingly, the iPhone 6 holds its own in many cases and if you have a low budget production, could be a solution for you.

Let’s compare them.

kickboxing at 240fps

First the caveats:

1: The FS700 shoots 1080p, the iPhone shoots 720p. Obviously if your footage HAS to be 1080p, then the iPhone is a no go. However, there are many instances where 720p is more than adequate.

2: The iPhone has no tripod mount. So you need something like this Joby phone tripod:

3: You can’t play the iPhone’s slow-motion movies on Windows; the Windows version of QuickTime doesn’t support the feature. They can be converted with a video editing app, but this is a really annoying problem for Windows users trying to shoot with the iPhone. The Sony movies play fine on a Mac or a Windows machine.

4: The iPhone will automatically try and focus and adjust brightness. This is the biggest problem with the iPhone. If you’re going to shoot with the iPhone you HAVE to consider this. We’ll discuss it a lot more in this article.

5: The iPhone does let you zoom while recording, but it’s not an optical zoom, so the zoomed image is lower quality than the non-zoomed one. With the FS700 you can change lenses, put on a sweet zoom lens, and zoom in to your heart’s content. But that’s one of the things you pay the big bucks for. We did not use the iPhone’s zoom feature for any of these shots, so in some cases the iPhone framing is a bit wider than the FS700 equivalent.

 

The Egg

Our first example is a falling egg. The FS700 footage is obviously better in this case.

The iPhone does very poorly in low light. You can see this in the amount of noise on the black background. It’s very distracting. Particularly since the egg portion IS well lit. Also, you’ll notice that the highlight on the egg is completely blown out.

Unfortunately, there’s nothing you can do about this except light better. One of the problems with the iPhone is the automatic brightness adjustment. It shows up here in the blown out highlight, with no way to adjust the exposure. You get what you get, so you NEED to light perfectly.

In the video there’s also an example of the FS700 shooting at 480fps. The 960fps mode of the FS700 is pretty lacking, but the 480fps does produce pretty good footage. For something like the egg, the 480fps has a better look since the breaking of the egg happens so fast. Even the 240fps isn’t fast enough to really capture it.

All the footage is flickering as well. This is a bit more obvious with the FS700 because there’s no noise in the background. The 480fps footage has been de-flickered with Flicker Free. Compare it with the 240fps to see the difference.

 

The MiniVan

In this case we have a shot of some cars a bit before sunset. This works out much better for the iPhone, but not perfectly. It’s well lit, which seems to be the key for the iPhone.

Overall, the iPhone does a decent job; however, it has one problem. As the black van passes by, the iPhone auto-adjusts the brightness. You can see the effect this has by looking at the ‘iPhone 6’ text in the video. The text doesn’t change color, but the asphalt certainly does, making the text look like it’s changing. This does make the van look better, but it changes the exposure of the whole scene. NOT something you want if you’re shooting for professional use.

With the FS700, on the other hand, we can fix the aperture and shutter speed. This means we keep a consistent look throughout the video. You would expect this from a pro video camera, so no surprise there. It’s doing what it should be doing.

However, if you were to plan for the iPhone’s limitation in advance and not have a massive dark object enter your scene, you would end up with a pretty good slow motion shot. The iPhone is a bit softer than the Sony, but it still looks good!

Also note that when the FS700 shoots at 480fps, it is much softer as well. This has some advantages; for example, the wheels don’t have anywhere near as much motion blur as in the 240fps footage. But the overall shot is noticeably lower quality, with the bushes in the background being much softer than in the 240fps footage.

 

The Plane! The Plane!

Next to the runway at LAX, there’s a small park where you can lie in the grass and watch planes come in about 40 feet above you as they’re about to land. If you’ve never seen the underbelly of an A380 up close, it’s pretty awesome. We didn’t see that while doing this comparison, but we did see some other cool stuff!

Most notably, we saw the problem with the iPhone’s inability to lock focus. Since the camera has nothing to focus on, when the plane enters the frame it’s completely out of focus. The iPhone 6 can’t resolve it in the few seconds the plane is overhead, so the whole scene is blurry.

Compare that to the FS700 where we can get focus on one plane and when the next one comes in, we’re in focus and capture a great shot.

The iPhone completely failed this test, so the Sony footage is the hands-down winner.

 

The Kickboxer

One last example where the iPhone performs adequately.

The only problem with this shot is the amount of background noise. As mentioned, the iPhone doesn’t do a great job in low light, so there’s a lot of noise on the black background. Because of the flimsy phone tripod, it also shakes a lot more. Overall, though, the footage is OK and would probably look much better if we’d used a white background. This footage had a flicker problem as well, and we again used Flicker Free on the 480fps footage to remove it. You’ll notice the detail of the foot and chalk particles is quite good on the iPhone. Not as good as the FS700, but that’s not really the question.

We want to know if Apple’s iPhone 6 can produce slow motion, 240fps video that’s good enough for an indie film or some other low budget production. (or even a high budget production where you have a situation you don’t want to (or can’t) put a $6000 camera) If you consider the caveats about the iPhone not being able to lock focus, the auto-adjusting brightness, and shooting in 720p, I think the answer is yes. If you take all that into consideration and plan for it, the footage can look great. (but, yeah… I’m not trading in my FS700 either. ;-)

Samsung Galaxy S5 Does NOT Shoot 240fps. It Shoots 120fps and Plays It Back at 15fps.

Like Digital Anarchy On Facebook

 

Apple’s iPhone 6 and the Samsung Galaxy S5 both shoot 240fps (or so you might think… 1/8th speed at 30fps is 240fps). Since we make Flicker Free, a plugin that removes flicker that occurs when shooting at 240fps, I thought it’d be cool to do a comparison of the two phones and post the results.

However, there was a problem. The footage from the Galaxy S5 seemed to be faster than the iPhone’s. After looking into a number of possibilities, including user error, I noticed that all the S5 footage was playing back in QuickTime Player at 15fps. Could it be that the Samsung S5 was actually shooting at 120fps and playing it back at 15fps to fake 240fps? Say it’s not so! Yep, it’s so.

To confirm this, I took a stopwatch app and recorded it with the Galaxy S5 at 1/8th speed (which should be 240fps if you assume 30fps playback like normal video cameras). You can see the result here:

If the S5 was truly shooting at 240fps, over one second the frame count should be 240. It’s not. It’s 120. If you don’t trust me and want to see for yourself, the original footage from the S5 can be downloaded HERE: www.digitalanarchy.com/downloads/samsung_120fps.zip
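The arithmetic behind the frame-count test is simple: the true capture rate is the playback frame rate times the slow-down factor. A quick sketch:

```python
def effective_capture_fps(playback_fps, slowdown_factor):
    # A clip slowed to 1/8th speed stretches 1 second of action over
    # 8 seconds of playback, so capture rate = playback fps * 8.
    return playback_fps * slowdown_factor

# iPhone 6: 1/8th-speed clips play back at 30fps -> a true 240fps capture
print(effective_capture_fps(30, 8))   # 240
# Galaxy S5: 1/8th-speed clips play back at only 15fps -> just 120fps capture
print(effective_capture_fps(15, 8))   # 120
```

Same 1/8th-speed label on both phones, but only one of them is actually capturing 240 frames every second.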

Overall, very disappointing. It’s a silly trick to fake super slow motion. It’s hardly shocking that Samsung would use a bit of sleight of hand on the specs of their device, but still. Cheesy.

 

You might ask why this makes a difference; it’s still playing really slowly. If you’re trying to use it in a video editor and mixing it with footage that IS shot at 30fps (or 24fps), the 15fps video will appear to stutter. Also, from an image quality standpoint, where you really see the problem is in the detail and motion blur, as you can see in this example:

iphone 6 vs samsung galaxy s5 240fps

Also, the overall image quality of the iPhone was superior. But that’s something I’ll talk about when I actually compare them! That’s coming up next!

VFX Students: Get Ready to Work for $600/mo.

I was talking with the owner of a mid-sized effects house in LA last weekend. They’ve always done most of their work where they could get subsidies to pay for part of salaries… Canada, Singapore, etc.

However, the staff for a new production is in Indonesia, where the artists are making $600/mo. They’re already doing production work and it may not be top tier, but it’s good.

Prices for VFX work have been going down for quite a while and it’s probably not going to stop. Yes, there are still jobs in the US, but the trend is moving towards countries where staff can be had for a lot less. The effort to unionize may help, but probably not as much as folks think. An electrician has to be on set. Most VFX work doesn’t require that. It can be done anywhere.

So, where does that leave students? I don’t have a lot of respect for the schools promising careers in VFX. They don’t mention the state of the industry while they’re happily telling students how to fill out the government loan forms. The end result is that you have students graduating these places with a lot of debt and not a lot of job opportunities.

There are jobs for the top graduates, but it’s been my experience that these students would be better off doing online training (www.fxphd.com, for example), working on their own projects, and getting an internship. They’re probably going to excel no matter where they are. These are, of course, the folks that get featured in ‘Alumni Stories’. But instead of ‘Alumni Stories’ I’d much rather see the percentage of ex-students working full time in the VFX industry. The reason you don’t see that statistic is that it’d be pretty depressing.

So if you’re thinking about a career in VFX, before you sign up for $20,000/yr in debt, consider the $600/mo the VFX artists are making in Indonesia. There are other ways to break into the industry than an expensive school. As an artist you may not want to think about finances, but I can assure you… once you have to start paying that back, you’ll be thinking a lot about it.

How Final Cut Pro X Caches Render Files (and how to prevent Beauty Box from re-rendering)

What causes Final Cut Pro X to re-render? If you’ve ever wondered why sometimes the orange ‘unrendered’ bar shows up when you make a change and sometimes it doesn’t… I explain it all here. This is something that will be valuable to any FCP user but can be of the utmost importance if you’re rendering Beauty Box, our plugin for doing skin retouching and beauty work on HD/4K video. (Actually we’re hard at work making Beauty Box a LOT faster, so look for an announcement soon!)

Currently, if you’ve applied Beauty Box to a long clip, say 60 minutes, you can be looking at serious render times (this can happen for any non-realtime effect), possibly twelve hours or so on slower computers and video cards. (It can also be a few hours, just depends on how fast everything is)

FCP showing that footage is unrendered

Recently we had a user with that situation. They had a logo in .png format that was on top of the entire video being used as a bug. So they rendered everything out to deliver it, but, of course, the client wanted the bug moved slightly. This caused Final Cut Pro to want to re-render EVERYTHING, meaning the really long Beauty Box render needed to happen as well. Unfortunately, this is just the way Final Cut Pro works.

Why does it work that way and what can be done about it?

Continue reading How Final Cut Pro X Caches Render Files (and how to prevent Beauty Box from re-rendering)

Why Doesn’t FCP X Support Image Sequences for Time Lapse (among other reasons)

In the process of putting together a number of tutorials on time lapse (particularly on stabilizing it), I discovered that FCP X does not import image sequences. If you import 1500 images whose names contain sequential numbers, it imports them as 1500 separate images. This is a pretty huge fail on FCP’s part. Since it is a video application, I would expect it to do what every other video application does and recognize the image sequence as VIDEO. Even PHOTOSHOP is smart enough to let you import a series of images as an image sequence and treat it as a video file. (And, no, you should not be using Photoshop’s caveman-like video tools for much of anything, but I’m just sayin’, it imports them correctly.)

There are ways to get around this, mainly using some other app or QuickTime to turn the image sequence into a video file. I recommend shooting RAW for time lapse, which means you have to pull the RAW sequence into one of the Adobe apps (Lightroom, After Effects, Premiere) for color correction anyways. It would be much nicer if FCP just handled it correctly without having to jump through the Adobe apps. Once you’re in the Adobe system, you might as well stay there, IMO.
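The grouping logic FCP X is missing isn’t rocket science. Here’s a rough sketch of how a video app could detect that a folder of numbered stills is really one sequence (the naming pattern and two-frame threshold are my own illustrative choices, not anything from FCP or Adobe):

```python
import re
from collections import defaultdict

def find_image_sequences(filenames):
    """Group names like 'shot_0001.jpg', 'shot_0002.jpg' into sequences.

    Returns {(prefix, extension): sorted frame numbers} for each group of
    two or more consecutively numbered files -- the detection step a video
    app performs before treating 1500 stills as a single clip.
    """
    groups = defaultdict(list)
    for name in filenames:
        m = re.match(r"^(.*?)(\d+)(\.\w+)$", name)
        if m:
            groups[(m.group(1), m.group(3))].append(int(m.group(2)))
    sequences = {}
    for key, frames in groups.items():
        frames.sort()
        # Keep only gap-free runs: highest minus lowest == count - 1
        if len(frames) >= 2 and frames[-1] - frames[0] == len(frames) - 1:
            sequences[key] = frames
    return sequences
```

Every NLE and compositor that supports image sequences does some variant of this on import; FCP X just declines to.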

No, I’m not a FCP X hater. I just like my apps to work the way they should… just as I tore into Premiere and praised FCP for their .f4v (Flash video) support in this blog post.

Time Lapse image sequence in Final Cut Pro failing to load as a single video file

 

What’s wrong with this picture?

Time Lapsing Around Italy with Flicker Free

Stephen Smith, a long-time videographer, used a recent trip to Italy as an opportunity to hone his time lapse skills. The result is a compilation of terrific time lapse sequences from all over Italy.

He used Flicker Free to deflicker the videos, Premiere Pro and After Effects for editing, and DaVinci Resolve for color correction. It’s a great example of how easily Flicker Free fits into pretty much any workflow and produces great results.

 

Italy Time Lapse from Stephen Smith on Vimeo.

 

Since he was traveling with his wife, she was able to explore the areas where he was shooting more thoroughly. That’s not always how it goes. Significant others are not always thrilled to be stuck in one place for an hour while you stand around watching your camera take pictures!

Although, he said it did give him an opportunity to watch how aggressive the street vendors were and to meet other folks.

We’re happy that he gave us a heads up about the video which is on Vimeo or you can see it below. Of course, we’re thrilled he used Flicker Free on it as well. :-)

 

Why does Final Cut Pro handle Flash Video f4v files better than Premiere Pro?

First off, if you want Flash’s .f4v files to work in FCP X, you need to change the extension to .mp4. So myfile.f4v becomes myfile.mp4

I’ve been doing some streaming lately with Ustream. It’s a decent platform, but I’m not particularly in love with it (and it’s expensive). Anyways, if you save the stream to a file, it saves it as a Flash Video file (.f4v). The file itself plays back fine. However, if you pull it into Premiere Pro for editing, adding graphics, etc., PPro can’t keep the audio in sync. Adobe… WTF? It’s your file format!

Final Cut Pro X does not have this problem. As mentioned, you need to change the file extension to .mp4, but otherwise it handles it beautifully.
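If you’ve got a whole folder of Ustream captures, the rename can be batched from the Terminal. A minimal sketch (the function name is our own invention; only the extension changes, nothing touches the actual video data):

```shell
# Rename every .f4v file in a folder to .mp4 so FCP X will import it.
# Sketch only -- the container is already MP4-compatible, so this is
# purely a rename, no re-encode.
rename_f4v() {
  for f in "$1"/*.f4v; do
    [ -e "$f" ] || continue          # no .f4v files? do nothing
    mv -- "$f" "${f%.f4v}.mp4"       # myfile.f4v -> myfile.mp4
  done
}
```

For example, `rename_f4v ~/Movies/ustream-captures` would convert everything in that folder in one shot.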

Even if you pull the renamed file into Premiere, it still loses the audio sync. So it’s just a complete fail on Adobe’s part. FCP does a terrific job of handling this even on long programs like this 90 minute panel discussion.

Here’s the Final Cut Pro file, saved out to a Quicktime file and then uploaded to YouTube:

Here’s the Premiere Pro video, also saved out to Quicktime and uploaded. You’ll notice it starts out ok, but then quickly loses audio sync. This is typical in my tests. The longer the video the more out of sync it gets. In this 30 second example it’s not too out of sync, but it’s there.

Breaking Down Using Beauty Box in a Music Video

It’s always cool to see folks posting how they’ve used Beauty Box Video. One of the most common uses is for music videos, including many top artists. Most performers are a little shy about letting it be known they need retouching, so we get pretty excited when something does get posted (even if we don’t know the performer). Daniel Schweinert just posted this YouTube and blog post breaking down his use of Beauty Box Video (and Mocha) for a music video in After Effects. Pretty cool stuff!

Here’s a link to his blog with more information:
http://schweinert.com/blog/files/49c88ab71626af3ecef80da0a92c9677-47.html

Flicker Free and the Bloody Beetroots feat. Tommy Lee

I think the most exciting thing about the Flicker Free plugin is that people are willing to let us talk about it when they use it. Beauty Box Video gets used all the time on very high profile projects, but no one wants to admit it publicly. It’s not worth some video editor’s job to say so-and-so pop star doesn’t have the flawless skin it looks like she has.

Flicker Free isn’t even officially released yet and we’re getting producers emailing us, letting us know it saved a shot (or shots) in their video AND they have no problem with us posting it. It’s awesome!

Such is the case with the music video from the Bloody Beetroots featuring Tommy Lee (of Motley Crue fame). One of the LED lights on the set was causing severe strobing in some shots. This isn’t regular flicker, it’s sort of rolling bands. It’s something we’ve seen only from LED lights and possibly electrical interference (the iPhone example in the FF demo reel is a good example of the problem). Flicker Free was the only thing that got rid of it. Just another big problem the plugin can solve. You can check out the final video below.

FCP X Blue Frame (or Screen) Problem : Update FCP Please

We’ve gotten a couple tech report requests about this lately, so it’s worth noting.

In earlier versions of FCP X, there’s a bug where third party effects will sometimes render a blue frame. This was fixed in 10.0.9 (I think), so it’s been fixed for some time, but folks need to upgrade (which is FREE).

For plugins to run correctly you REALLY need to update FCP X. This affects all plugin developers, not just Digital Anarchy. I understand the hesitation about upgrading an app that’s working for you, but in this case you really should upgrade. It’s free and will prevent you from eventually running into the problem… probably right in the middle of a big project (when you shouldn’t upgrade) with a plugin you HAVE to use. There’s no way to get rid of the blue frame if it’s happening other than to upgrade.

Here’s a link to the Apple Knowledge base with info about it:

http://support.apple.com/kb/TS4182?viewlocale=en_US

Customers That Piss Me Off

Let’s say you did some work for a client 3 or 4 years ago. A promotional video featuring upper management or something. They come back now and want you to redo the video with current management but everything else can stay the same. Just re-shoot a few people and drop them into the old video. Of course, because this is clearly so easy and they paid you once before, they want you to do it for free. What would you tell them?

We have people do this to us all the time. People who buy a new Mac, upgrade to FCP X, and get all pissy when we tell them they’ll have to buy an upgrade from us. Then they threaten to run off to BitTorrent because, you know, they paid us once four years ago.

It requires a TON of work to keep software working with all the changes Apple, Adobe, Nvidia and everyone else keeps making. Most of this work we do for free because they’re small incremental changes. Every time you see Beauty Box v3.0.1 or 3.0.2 or 3.0.7 (the current one)… you can assume a lot of work went into that and you don’t have to pay anything. However, eventually the changes add up or Apple (most of the time it’s Apple) does some crazy thing that means we need to rewrite large portions of the plug-in. As happened when FCP went from 7 to X. It’s too much work to do for free. We still need to eat and pay rent.

We want to support our customers. The reason we develop this stuff is because it’s awesome to see the cool things you all do with what we throw out there. However, shelling out $199 does not mean we can support you indefinitely. How much money has that software made you or how much time has it saved you in the three or four years since you bought it? We want to support you, but if we go out of business, that’s probably not going to benefit either of us.

We realize most of our customers understand what it takes to keep our software up to date. We are very grateful to you. We also realize forced upgrades suck and understand the frustration that goes with them. (we buy a lot of software too) Just understand that as a third party/plug-in developer we’re highly dependent on other companies. When one of those companies makes a big change, it usually takes a lot of work to keep things running.

Sorry for the rant, just something that needed to be said (and probably won’t be read by the people that need to read it). Just a little blog therapy that breaks most of the rules of Marketing 101. ;-)

It’s a 4K World! … or not.

A survey released recently seems to indicate that, despite all the marketing, there’s little consumer interest in 4K… or UltraHD as it’s now called. It’s estimated that it will take until at least 2017 for 4K televisions to make up more than 5% of the market.

Having just gotten back from CES and seen all the latest and greatest 4K stuff, it’s fairly obvious why. It’s not that much better. Manufacturers are touting all sorts of crazy things to justify 4K… you can have the picture spread across two screens! You can look at more of your surveillance cameras on one screen! All sorts of great ideas that try to get around the fact that the one thing you don’t want to do is buy a 4K TV just to WATCH it. The picture is a little better than standard HD, but on a screen of less than 90″, it’s just not that noticeable. I can see the eyes of the quarterback just fine in HD. I don’t need to see his nose hairs.

Of course, showing all that skin detail is great for stuff like Beauty Box, so we’re looking forward to the 4K revolution. However, if you’re worried about producing 4K content, you can probably relax. It’s going to be a long while before anyone can watch it.

Beauty Box Video or Photo Crashing Problems – How to Fix It

Beauty Box makes extensive use of your video card’s GPU (graphic processing unit) to speed the plugin up. Usually this works great and results in the plugin working quickly.

However, it can cause problems. The GPU has a lot less memory than your computer does, so it’s prone to run out of memory. This is especially true when other applications are trying to use it. After Effects, Premiere, Final Cut, and most of the other apps we plug into also use the GPU. So do many other plug-ins. All this software trying to make use of a limited resource can be problematic.

Older video cards are also a problem. Beauty Box is doing some heavy duty processing and the older video cards may not be up to the task. Particularly if you’re using very high resolution video or photos.

So what to do about it? Here’s some fixes:

Continue reading Beauty Box Video or Photo Crashing Problems – How to Fix It

Creative Cloud from a Software Design Perspective

The Creative Cloud has gotten mixed reviews from users. Many users don’t like the idea of ‘renting’ software and feel Adobe is forcing them to pay more or gouging them. While this may or may not be true, there are other reasons for Adobe wanting to make this switch.

Software is traditionally done in big releases. You work for a year or more and deliver the final product with much fanfare. This is a feast or famine type of thing… users get all or nothing and the company bets the farm that the release is all that and a bag of potato chips. This really isn’t great for either users or the company.

Continue reading Creative Cloud from a Software Design Perspective

Patent Trolls

The other day I had a friend call and ask me if I could help him out with some info about visual effects. He’s not in the industry, so I wondered about this, but I gave him a call back ready to help him out if I could. As it turns out he was looking for information about Match Moving. It’s not something I know a ton about, but I know some people that I could refer him to. I asked him why he was looking for the info. He mentioned he was working for a company doing some research for a patent they own. I asked him if this company had a product related to match moving? No. Were they thinking about building a product? No. So basically they’re a patent troll? At which point he admitted he was working for a patent troll. It’s good money apparently.

IMHO, patent trolls are the terrorists of the tech industry. (note: I’m not saying they are terrorists, they aren’t killing people)

Continue reading Patent Trolls

What Does the Creative Cloud Mean to You?

Last month we asked folks to do a survey about the Creative Cloud and how they felt about it. I thought I’d share the results, as some of you may also be curious what your fellow users are thinking about the Creative Cloud.

Creative Cloud

Keep in mind that the survey was done before Adobe announced the price drop on Photoshop & Lightroom to $10/mo. So the data is already somewhat out of date, but maybe sheds some light on why they dropped the price.

Continue reading What Does the Creative Cloud Mean to You?

Sorry Leap Motion, Like 3DTV, Nobody Wants You

Let’s get one thing straight… consumers don’t like 3D. Well, ok, they like 3D for about 5 seconds then ADD kicks in and they get over it. (Gamers are an exception of course) Tech geeks, and especially graphics geeks, LOVE 3D.

For everyone else it’s mostly… Meh.

Nobody wants 3D on the web (except for gamers), nobody wants 3DTVs, and nobody is going to want a controller that works in 3D space (except gamers). It’s cool technology, but it is definitely a solution desperately searching for a problem. The problem is, there is no problem.

But some people don’t get it. Just as Microsoft failed to grasp that desktop computers are not tablets, Leap Motion is failing to grasp that desktop computers are not game machines. They are occasionally used as game machines, but when you’re not playing a game, you don’t want your computer to act like a Wii. I don’t think they even understand games, as gaming on the desktop is not usually a group activity like the Wii.  This failure of understanding is leading to some things like this unintentionally funny video showing how to use Final Cut Pro X to edit video with a wave of your hand. I’ve always wanted to be Tom Cruise, but… er, well, actually I don’t want to be Tom Cruise. Nevermind. So much for the one redeeming thing about that.

I can see some very niche uses for the Leap Motion device. Where there are groups of people around a screen (like Minority Report) it has the potential to be useful. For everyday computing or things like video editing? It’s ridiculous. Yes, some early adopter folks will buy it. However, it will eventually end up in the box in the back of their closet marked: ‘Things I will put in the tech museum I’ll open in my garage in 25 years’

All Installers are Updated for Creative Cloud

We’ve finally got all of our installers updated to recognize Adobe’s Creative Cloud. All of our plugins worked in CC, but you needed to point the installer to the right directory. We weren’t finding it automatically. Now we are! :-)

Actually this only affects the Photoshop Mac installers. The Windows installers look for the last version of the app you installed. If that was CC, then that’s what it would find. The Mac installers look for every installation of the app, so we have to specifically tell it to look for each new version. Probably more info than you wanted, but for those of you, uh, enjoying Creative Cloud… all of our products will install easily.

Why the NSA Spying on You Matters

When I was a kid growing up in the 80s we were often told how much better America was than the likes of East Germany or the Soviet Union. One of the reasons for this was that we, unlike them, didn’t have a secret police that spied on everything citizens did. The East German secret police, the Stasi, went so far as to attempt to shred all the documents it had in the last days of the regime. Germany is now piecing together those records that got shredded and allowing people to see what was collected about them. (along with the records that did not get shredded)

We were also told that things like McCarthyism, J. Edgar Hoover’s FBI blackmailing and intimidating people, Watergate, and the like were aberrations, and that we had laws to protect us against abuses of our constitutional rights. There are always those in government who would abuse information and people to their own ends. So those laws, like the Electronic Communications Privacy Act (ECPA), are designed to act as checks and balances and hopefully provide some protection against abuse. The types of protection provided by the ECPA are essential.

I’ve heard many people say, “Well, let them read my email. I’ve got nothing to hide!”. Really? You’ve never done anything that you didn’t wish to be publicly known? Many of us break small laws all the time, sometimes completely unaware of it. Or we do things that might look suspicious enough to get us on the ‘list’. A friend of mine recently claimed she had nothing to hide. However, she’s a dual citizen in both the US and Mexico and calls/visits Mexico frequently. If she were to tick off some petty NSA employee, they might be able to add her to the ‘list’ easily. Or it might accidentally happen. Even if you’ve done nothing wrong, the problem is the emotional and financial cost of defending yourself.

In 2009, Digital Anarchy got audited for our 2007 tax returns. Nothing strange, we just won the IRS Audit Lottery, no more, no less. However, our auditor came across a payment we made to a contractor and went apeshit, demanding all sorts of other documents. This took a great deal of my time to deal with, caused me a great deal of stress, and cost us a lot in accountant fees for finding and processing the information requested. In the end, it turns out we mis-filed a form and we were fined $900. That was it. The real cost was the $8000 in accounting and other costs we incurred, to say nothing of my time and stress. For a three person software company it was a big deal. So it goes if you somehow end up in the government’s crosshairs. As a Mexican national can you imagine the potential headaches my friend, who so cavalierly says “read my email!”, might have if she somehow ends up in those crosshairs?

The NSA is stomping all over the ECPA and the Fourth Amendment. The government, in both the Bush and Obama administrations, is using ‘terrorism’ as a way to encroach on your rights, just as Senator McCarthy and J. Edgar Hoover used ‘communism’ to do the same. Has it gotten to the level of McCarthyism? Of course not. But that’s because we pushed back on Bush’s warrantless wiretapping. And we need to push back on the NSA collecting our phone records and reading our email.

We have a right to privacy. That is one of the things we’re celebrating today. It’s just a small part of the amazing document called the Constitution. It’s what makes me proud to be an American. We can not let those in power scare us with bogeymen like Communism, Terrorism, or whatever the threat du jour is. That there are threats in the world can’t be argued. How we respond to them can be. If we give up our constitutional rights how are we any better than those Communist states we used to decry?

What makes America great, and today worth celebrating, are the freedoms we have. They are freedoms worth protecting. Sometimes that means you need to stand up and make your voice heard. Sometimes that means sacrifice and more inconvenience than signing an online petition (or writing a blog post). Do not let anyone diminish your rights.

Speeding Up Beauty Box Video

We’ve come a long way from Beauty Box Video 1.0, which was pretty slow. It’s now as fast as any other solution out there, and BB still offers the easiest and highest quality way of doing retouching for HD, 4K, and film. That said, it still requires a render and there are various things that can slow it down. It can really slow FCP X down if FCP isn’t configured correctly.

What should you expect speed-wise from Beauty Box?

A minute of HD video should take 3-10 minutes to render out on a reasonably fast machine. So let’s discuss how to get those faster speeds. If you have a fast video card (say, the Nvidia 680 in an iMac) and are seeing really slow speeds, make sure you read to the end where we discuss the configuration file BB uses.

After Effects: Beauty Box will render faster in AE than in any other host app. This is primarily because of how AE handles multiprocessing; it’s far better than any of the video editing apps. It requires a fair amount of RAM to really take advantage of, but it can run very fast. If possible, we recommend doing the Beauty Box pass in AE and then bringing the intermediate file into your editing app to cut.

Final Cut Pro 7 & X: If you’re using FCP X, turn off background rendering. Background rendering works great with basic filters, but when you have something render intensive like Beauty Box, background rendering will bring FCP X to its knees. Also, turn off scrubbing. FCP will start caching frames and, again, start rendering multiple frames in the background, which will really make FCP sluggish. Generally, we recommend either applying BB first and then turning it off as you’re editing, OR applying it last. Applying it last is the preferred way. You can take your edit, create a compound clip, and then apply Beauty Box to the compound clip. In FCP 7, a compound clip is called a ‘Nest’.

Premiere Pro: Similar in some respects to what happens in Final Cut Pro. It’s not a real-time effect, so it’ll prevent the Mercury engine from rendering in real time. So, again, you want to apply Beauty Box either before you start editing (and turn it off while you edit) or apply it as the last step after editing and color correction (recommended).

Video Cards: Beauty Box is accelerated using OpenCL. This means it’ll get a massive speed boost from newer Nvidia and AMD video cards. In practice, this speed boost can vary quite a bit. We’ve run into more problems with AMD cards than Nvidia, so we recommend Nvidia cards if possible. Although, usually AMD cards are fine, so it’s not a huge deal. It does tend to be a bit more of a problem on the Mac where Apple creates the drivers. The AMD drivers tend to be more problematic than Nvidia’s. Regardless of which video card you have, we recommend getting the most recent Mac OS (and staying current with updates). Apple rolls driver fixes into the latest OS, so if you’re using an older OS, it’s potentially a problem. If you’re on Windows, you can just download the latest drivers, so it’s less of an issue.

What video card to get?

We still like the Nvidia GeForce GTX 570 as the best price/performance option out there. For video applications, the Quadro cards don’t offer a lot of benefits: they tend to be slower and you’re paying for features that are more applicable to engineering/3D apps. If you do a lot of 3D work, a Quadro might be a better choice (I don’t do much 3D, so I can’t comment on that). The newer GeForce cards like the GTX 680 and GTX Titan are great, but don’t necessarily offer the speed boost to justify the extra cost. They are faster, so if you’re looking for the absolute fastest card, the Titan or GTX 690 is a great choice. Both cards require a ton of power, so make sure you’ve got a small nuclear power plant as your power supply.

OpenCL Configuration File

Beauty Box creates a special configuration file for the video cards in your system. It’s a file you can send us to help troubleshoot any problems, and it also keeps track of whether a given video card crashes when used with BB. If the card crashes every time, disabling OpenCL for it is a good thing. However, sometimes you’ll have a random one-off crash and BB will disable OpenCL anyway. This causes a dramatic slowdown in rendering. The solution is to delete the configuration file; BB will then recreate a default file the next time it starts, and rendering speeds will be back to normal. But you need to know where the file is to delete it, so here are the locations:

Windows: Documents\Digital Anarchy\DA_OpenCL_Devices.txt

Mac: /Users/Shared/Digital Anarchy/DA_OpenCL_Devices.txt
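On a Mac you can also script the delete from the Terminal. A minimal sketch (the function name is our own; pass it the path listed above for your system):

```shell
# Delete the Beauty Box OpenCL device file so BB rebuilds a default one
# the next time it launches. Sketch only -- if the file isn't there,
# nothing happens.
reset_da_opencl() {
  if [ -f "$1" ]; then
    rm -- "$1" && echo "deleted $1 -- relaunch your host app"
  else
    echo "no config file at $1"
  fi
}

reset_da_opencl "/Users/Shared/Digital Anarchy/DA_OpenCL_Devices.txt"
```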

Happy rendering… :-)

Beauty Box Video 3.0 Released!

The new version of Beauty Box Video for After Effects, Premiere, and Final Cut Pro 7/X is available for purchase or you can download the trial version. We’ve added a number of great new features, first and foremost is greatly improved automatic masking. This allows us to more accurately identify the skin tones and track them throughout the video clip. This means the retouching that Beauty Box does looks better than ever. Here’s an example:

Comparing the automatic mask from Beauty Box

No automatic mask is perfect; we’re still picking up a bit of the background, but it’s much improved from 1.0.

The other big new feature is the addition of preset Styles. It ships with 35 different styles to give your video a wide variety of looks, from a warm glow to a ‘day to night’ look. These are modifiable, so you can adjust the amount of smoothing up or down.

We’ve also improved the shine removal, improved the OpenCL support, so it should be faster on most cards, and made a bunch of other small improvements and bug fixes.

It’s a great upgrade and until June 30th it’s only $59 for Beauty Box 2.0 users ($99 for 1.0 users). New licenses are also on sale for only $149 (save $50!).

To purchase head over to the Digital Anarchy store. If you want to download the free trial and get more info, click here.

Oh, and if you’re wondering what happened to OpenFX support… we’re adding NUKE support and the OpenFX version will be released in a couple weeks. At the same time, we’ll also be releasing a brand new version for Avid systems! You can download the beta of both the OpenFX and Avid builds here.

The Unreal World of Retouching

I always find it interesting to see which images creative directors feel it necessary to retouch. A great collection of photos showing before/after retouching is below. Some obviously need it, like 50-something Madonna trying to look twenty-something. However, shots like the Jennifer Lawrence example are something of a mystery: a beautiful woman with a great body made to look rail thin, for no real good reason… other than that the anorexic look is what everyone aspires to. Sad. Here’s a link to all the images (the two mentioned are below, but there are a couple dozen on the Imgur page):

Before and after photoshop

madonna retouched

Greenscreen Tips for Shooting Video

There was a question the other day on the After Effects List about tips for successfully shooting greenscreen. A couple good links were suggested (see below), but one that stood out was rotating the video camera vertically. If you’re shooting a person standing, and they’re going to be keyed anyways so you don’t need the extra space horizontally, use the wide part of the camera to capture more vertical resolution. It was also a reminder that shooting greenscreen is difficult even for pros.

Great tips from Jonas Hummelstrand:

http://generalspecialist.com/greenscreen-and-bluescreen-checklist/

and from the After Effects Help section!

http://adobe.ly/RNe3pz

More vertical resolution, anyone?

Vertical resolution is good for greenscreen video

NAB Trends

What’s Trending at NAB

Around this time of the year, you start seeing a lot of talk about what’s going to be released at NAB. It’s always interesting to look at some of the larger trends that are out there. Of course, what’s trending for Digital Anarchy is Beauty Box 3.0. The photo version just got released (see below) and the video version is not far behind. But beyond that…

NAB Plugins Software After Effects Final Cut Pro

There are some of the obvious ones:

Continue reading NAB Trends

Beauty Box Photo App Got to #6 in the App Store!

Top Free Photo or Video App in the App Store

Yesterday, our new iOS app, Beauty Box Photo, climbed all the way up to #6 in the Top Free Photo Apps! That’s 3 spots behind Instagram! Pretty impressive for an app that’s only been out for a month. (ok, in the screenshot it’s at #7, but it was at #6… you can trust us! ;-)

Given that there are over 19,000 photo apps in the App Store, getting into the top 10 is a pretty impressive feat. It requires a LOT of downloads, so we are really thrilled by it.

Even more exciting is that version 1.1 is coming out in a couple weeks, adding all sorts of cool stuff. 1.0 has gotten mostly 5-star reviews, and if folks liked that, 1.1 should be really well received. Stay tuned for all the details once it’s available in the App Store.

The MacPro and Does Anyone Really Need It?

Apple has confirmed several times that a new MacPro is coming, so I believe them. There have been some good blog posts recently about this, notably Larry Jordan’s. The spat Apple has with the EU, which has resulted in them not selling MacPros in Europe, is mostly irrelevant. (Not completely, as you’ll see in a moment…)

So I’ll jump into the fray. Yes, I’m playing armchair psychologist here. I have no inside info and am making all this up based simply on having watched Apple intently (and been subjected to their whims) for 25 years as a customer and developer. Take it for what it’s worth.

So why the about-face, when it looked like the MacPro was done for just a year or so ago? I think the main factor is that Steve Jobs is no longer with the company.

Continue reading The MacPro and Does Anyone Really Need It?

Photo Manipulation, Contests, and Getting Disqualified

Good article here about how the photo that would have won National Geographic’s photo of the year got disqualified because the photographer used Photoshop to get rid of a trash bag instead of cropping it out.

If you’re going to enter contests it’s a good thing to read the rules, but it’s almost a certainty that if you bust out the clone tool or use Content-aware Fill you are going to be disqualified. Obviously if it’s a photo manipulation contest that’s different, but most photography contests want you to do everything in camera, limiting adjustments to minor tweaks like cropping and contrast, things that were relatively easy to do in the darkroom days.

While you might argue over the difference between cloning a trash bag out of a photo and cropping it out, it’s a very slippery slope. It rapidly becomes more about your Photoshop skills and less about your photography skills. If it’s a photography contest, then it should be about your photography skills.

Here’s the disqualified image, click on it to read the full article and see the original.

disqualifed Photoshopped photo

The CES Hangover

You wake up from a dream of dancing pink elephants being chased on a rollercoaster by planet sized mosquitoes… and discover that waking is even weirder. Welcome to CES.

Ok, maybe not quite weirder, but my god, does the world need 1001 makers of iPhone cases and headphones? That’s innovation? I know those kids in the iPod commercials looked cool and all, but really? 1001 companies? And why did the Postal Service have a booth with Elvis and Marilyn Monroe impersonators? Not to mention the man-sized Q*bert-looking things standing around a table-sized Surface device playing a kids’ game. And then there was the keynote. Fear and Loathing in Vegas indeed. No acid needed.

When they say ‘consumer electronics’, they mean it. It’s not just TVs, it’s every last little doodad that can be plugged into a wall. And in fact, there were many innovative items floating around, but my main interest was video related stuff, so I’ll chat about that. Although, the Audi booth was a photographer’s dream: basically you were inside a giant softbox with a bunch of great cars. Very cool if you wanted to work on your car photography.

I still don’t think the TV manufacturers get it. Panasonic and Samsung sort of did, LG and Sharp not so much, and Toshiba didn’t even have someone who spoke English. The Panasonic and Samsung second screen and internet TV offerings were pretty well thought out as a way to control the TV and content. However, you get the feeling that Apple is going to roll into this space and change the way we think about what a TV will do, the same way they changed what we thought a phone could do. The current offerings just seem lackluster, with the internet tacked on, not actually rethinking what you can do with a big internet-connected screen.

3D was mostly dead. Only LG had a 3D showing… a giant 30 ft wide/12 ft high screen showing headache inducing 3D graphics. Like 3D TV overall, it was a Fail. Good riddance.

UltraHD (4K) is officially here. From the content I saw, I have a hard time believing it will get the response HD did. SD compared to HD was night and day. UltraHD is better, but not so much better that it's a no-brainer upgrade. I guess we'll see. The manufacturers have given up on 3D, so UltraHD is the new 3D.

A few camera vendors announced cameras that could use apps. Samsung will use Android and Sony, of course, will use their own operating system. Sony _could_ do what Amazon does and tweak their own version of Android and create their own store. But, no, they’re going to roll their own and have 6 people develop apps for it. And yes, they’ll be releasing training videos for this new Sony OS on Betamax. Didn’t Sony lose like a trillion dollars last year? No idea why.

And really, the keynote was f’ing bizarre.

Nvidia GeForce GTX 570 in a Macintosh

All the speed tests we’ve done with Beauty Box on Windows show the Nvidia GeForce video cards to outpace their much more expensive cousins, the Quadros, significantly. A GTX 570 (~$270) is about 25-30% faster than a Quadro 4000 ($800).

Since Beauty Box can involve some render time, we’ve wished that Apple would authorize one of the newer GeForce cards for the Mac. No such luck. So we’re tired of waiting. We took a stock PNY GeForce 570 and put it into our MacPro. And lo! It works!

So… what’d we do and what are the caveats? This was not a 570 with ‘flashed’ ROM. This was just a straight up 570 which we use in one of our PC machines. Nothing fancy. We did need to download a few things:

– Latest Nvidia driver for the Mac, which can be found here: http://www.nvidia.com/object/macosx-304.00.05f02-driver.html

– Latest CUDA drivers for the Mac, which can be found here: http://www.nvidia.com/object/mac-driver-archive.html (as of this writing, v5.0.37 was the latest)

– If you're using Premiere you need to update the cuda_supported_cards.txt file to add the name of the video card, which in this case is 'GeForce GTX 570'. To do this, go to the Premiere.app file, right-click on it and select 'Show Package Contents'. Once you do that, this is what you'll see:

CUDA nvidia opencl adobe premiere macintosh

Once that's done, you are good to go!
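For command-line folks, the edit can also be scripted. This is just a sketch, not an Adobe-sanctioned procedure; the real cuda_supported_cards.txt lives inside the Premiere Pro application package (the exact path depends on your Premiere version), so the example below operates on a stand-in copy:

```shell
# Stand-in for the real file, which lives inside the Premiere Pro
# application package (exact path depends on your Premiere version).
CARDS="cuda_supported_cards.txt"

# Seed it with a couple of entries, roughly as the shipping file looks.
printf 'GeForce GTX 285\nQuadro 4000\n' > "$CARDS"

# Append the new card's name only if it isn't already listed.
grep -qxF "GeForce GTX 570" "$CARDS" || echo "GeForce GTX 570" >> "$CARDS"

cat "$CARDS"
```

On the real file you'll likely need admin/write permission, and it's worth keeping a backup copy since a Premiere update can overwrite your change.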

Now the caveats…

Continue reading Nvidia GeForce GTX 570 in a Macintosh

Why Don’t TV Manufacturers Get It? Stop Making TVs!

It’s been a couple years since I wrote about no one wanting 3D and people wanting Internet enabled TVs. TV manufacturers still don’t seem to get what people want. We either want TVs to be the same passive viewing experience they’ve always been or we want them to be internet devices (or probably both at the same time).

If Apple comes out with a TV, I don’t think it’s hard to guess what it’ll be. It will not be a TV. It will be built from the ground up as an internet device with a big ass screen whose primary use is displaying content.

internet TV, google, apple, apps

There was a survey recently released that said less than 15% of Smart TV owners are using the smart features. This isn’t particularly surprising because most ‘smart’ TVs aren’t very smart, don’t have well thought out apps that take advantage of it, and still want you to use a remote. Why? Because TV manufacturers still think they’re selling TVs.

Let's go back to what Apple would release… and if they do, all the other manufacturers will go 'ooohh… that's how you do it.' (And I'll point out that I'm not an Apple fanboi… but they do have a habit of releasing game-changing devices, so I'm using them as an example.)

Anyways…  features of an Apple branded big screen internet device:

Continue reading Why Don’t TV Manufacturers Get It? Stop Making TVs!

48fps Sucks

Maybe the title of the post is overly blunt, but it’s true. I saw the Hobbit in 48fps, in 3D. Please don’t make the same mistake.

48fps, hobbit, peter jackson

48fps. Looks great!

I have no idea if the Hobbit is a good film. The 'soap opera' look of 48fps combined with 3D was distracting and outright ruined many scenes by making them look like a low budget Saturday morning cartoon. The climactic scene actually works out pretty well, but for the first 2+ hours it's an awful movie experience. Peter Jackson has gone on record as saying that 48fps makes 3D more enjoyable. Whatever he is smoking, please send some of it to San Francisco. 3D tends to brighten the image up to begin with, and you add 48fps to that mix and the result is so bad it's comical.

I was hoping the initial reports of the look of 48fps were exaggerated and due to viewing unfinished shots. I think it's clear that in both cases it's not. It looks like 3D humans suffering from the 'uncanny valley' effect. It doesn't look like film, but it doesn't look real either. It just looks like bad TV. With Hobbits. Maybe they can resurrect the Ewok Christmas special and shoot that in 48fps, 3D.

I realize there's a lot of new technology out there and you have to test it out on something. But to test it out on a major motion picture? Honestly, I wish folks would just try to make better movies instead of screwing around with all this stuff (48fps, 3D) which doesn't make the films look better and rarely adds anything to the story. In the case of the Hobbit, it really detracted from the story.

I do think there’s some technology which will change movies for the better. The super high resolution cameras produce great looking imagery. Internet connected TVs will change the way we watch movies and how they get distributed. But 48fps is just crap. So thank you to Peter Jackson for proving that.

Looking for Avid and Nuke Beta Testers

As noted in our most recent newsletter:

Due to popular demand, we are porting Beauty Box far and wide. Even farther than you might imagine, but you’ll hear more about that over the next few weeks.

For now, we have builds of Beauty Box for Avid and Nuke that we think are working well. We need some folks who actually use these host apps on a regular basis to verify that for us. So if you (or someone you know) might be interested in beta testing, please drop me an email at jim@digitalanarchy.com. Please let me know what platform/program you're on, if you've beta tested before, if you've used Beauty Box with other host apps, and if you like fruitcake. We're pretty flexible about who we allow to beta test, but I draw the line at fruitcake. With eggnog we have a don't ask, don't tell policy. Champagne and snickerdoodles are fine by us though.

We’re very excited about both these apps, so we’re looking forward to getting Beauty Box out there.

avid Media Composer, plugins, plug-in, Foundry Nuke

Using Plugins on Multiple Computers

So you've got two (or 20) computers and you want to use Beauty Box (or whatever) on all of them.

This is always a tricky thing for software developers. On one hand we realize many folks have multiple machines and since they’re only one person, they can only use one machine at a time. We would like to allow them the flexibility of having it on a couple machines. On the other hand, if you’re a studio with multiple machines and multiple people we think that if our software is good enough to be installed and used on all those machines we should be paid for it. Making sure that happens sometimes gets in the way of how a single user is using our plugins.

Companies

When you buy a license of our software, you're buying it for one user. If you're a company with multiple machines and multiple artists/editors using those machines, then there's not much gray area and you need a license for each computer being used. We offer pretty good volume discounts and site licenses for this type of situation; you can contact sales@digitalanarchy.com for pricing.

There is one big exception to this… if you’re using After Effects’ network rendering. You do not need extra licenses for After Effects render nodes. You can install Beauty Box on as many render nodes as you want for free.

People (and, no, companies are not people. I don’t care what the Dread Pirate Roberts says)

If you’re just one person with multiple machines then there’s some gray areas. The software can be installed on a couple machines, but we use the internet to determine if the plugin is being used on multiple computers at the same time. So if you have a desktop and a laptop and you’re using one or the other depending on whether you’re at home or at the office, no problem. You’re good to go.

However, if you’re in your studio/office and trying to use both machines for rendering/editing at the same time, you may run into problems. If so, here’s what you can do:

1) You can purchase a second license. We do offer discounts for second licenses. Contact sales@digitalanarchy.com.

2) Use the second machine as an After Effects render node. As mentioned above, you can use Beauty Box on as many render nodes as you want for free. So if the machine is just being used to process frames sent to it from another machine you shouldn't have any problems.

3) Our licensing is set up so that you can install on two machines; they just can't be in use simultaneously. The way we check this is via the internet, so if you disable the internet connection on one machine, we can't check it. This is a hack and technically violates the license. However, since the spirit of the license is for one user, as long as it's the same person using the machines we're ok with it.

4) Render out the Beauty Box clip on one machine while working on another part of your project on the second machine. BB just gets watermarked on the second machine, so it's still usable.

Like most of you, we’re running a small company. We try to be as flexible as possible, but if you’re making money using our software we would like you to buy the correct number of licenses. Please support the companies that make the tools you use and that help you be successful.

When Cats Go To NAB

We’ve been exhibiting at NAB since 2001 and one of the traditions is that any extra exhibitor badges we have get put in the name of our hard working mascots… Fierce Peanut and Molotov Cupcake (our two cats). They have not made it out to Vegas yet, but we don’t want them to feel left out so they get their own badge.

This has led to them receiving emails, including some amusing ones such as the one below inviting them to speak at a conference. While I’m sure Ms Peanut would be more than happy to speak, her expertise tends to be limited to mobile device viewing, particularly games and media for… well, cats.

Why, yes, we’d be delighted to speak at your conference. Will there be tuna hors d’oeuvres?

Don’t be a Grumpy, Old Photographer

I recently was chatting with a photographer who pretty much blamed all the ills of the industry on Moms in hot pants. Yep, that’s why he no longer goes to WPPI and why the photo business isn’t what it used to be. Moms in hot pants with their toy DSLRs undercutting real photographers. What IS the world coming to?

(ok, so this is from the Sony advert that’s very funny. See post from last week.)

I think mostly what he’s upset about is a new generation of photographers. I suspect when he got out of school there were a bunch of old photographers bitching about all these kids with their Canon AE-1s running around in bell bottoms pretending to be photographers and working for peanuts.

But change happens. A new generation comes along, new ways of marketing appear, and new cameras are released. Just because you think Twitter is the dumbest thing since the Pet Rock doesn't mean you don't have to use it. (At least Twitter doesn't limit who sees your posts like Facebook does now.) Marketing has always been critical in photography and it's even more so now. It's just that the way of doing it has changed somewhat. It requires a little more consistent engagement… like this blog. Which you'll note I'm not writing on Facebook. I'll post the link on FB, but because FB limits who sees it, it's much more effective to do the writing here and link to it from the various interwebs.

If a few Moms with Canon Rebels are on the verge of sending you out of business, I don’t think the issue is the Moms. Hell, hire one of them. If you can get in with the Mom Mafia you’re golden!

And besides, given the amount of tradeshows I go to that are nothing but geeky guys, I’m having a hard time seeing what the complaint is about a little gender diversity (hot pants or no). But it’s no secret why you’re seeing a lot of women in photography… they tend to communicate better than men, have more emotional intelligence, and are excellent shooters. I mean, who do you think a bride is going to want to shoot her wedding? The energetic gal in hot pants or the grumpy, old guy? It’s all about being a good communicator these days, whether it’s on social media or during a shoot. Don’t be the grumpy, old guy.

Great New Tutorial on Shooting Greenscreen/Chromakey Photos

We've wanted to do a training video for a while that touched on all aspects of greenscreen photography. From the photo shoot to the file management and, of course, the keying. We recently had the opportunity to work with Mike Price of Fairfield Photography to do just that. Mike has been shooting youth sports for years and uses greenscreens and Primatte.

Shooting Chromakey photograph on a greenscreen and keying greenscreen photos in Photoshop

In this 30 minute video, Mike touches on all aspects of Greenscreen photography: Shooting and light setup, managing photos, setting up actions, and doing the keying. If you’re new to chromakey photography, regardless of whether you’re using greenscreen or bluescreen, this is a great video. Even if you’re an old hand, it’s always great to see how other folks are doing it. So check it out!

http://vimeo.com/53470129

Sony Makes a Funny Video Making Fun of DSLR Users

Sony has come out with a pretty funny video about DSLR users who really shouldn’t be using a DSLR. The campaign is called: DSLR Gear, No Idea.

One has to ask… What is Sony going for here? This is only funny if you know something about DSLRs. The camera they’re advertising is a point and shoot with some fancy features, but they’re pretty much insulting the people they want to buy their camera. Seems like an odd teaser campaign… but it is funny!

I Feel the Need for Rendering Speed (or Why I Love My GPU)

We’re about to release a free update to Beauty Box Video (2.0.4… look for it next week) and figured it was time to talk about GPUs again. We’re seeing 500-800% speed increase using the GPU on newer graphics cards, especially Nvidia boards which seem to be more stable than AMD or Intel.

(You can get more info on Beauty Box and a free trial version HERE)

GPU accelerated plugin for after effects, Premiere, and Final Cut Pro

So where are we getting these numbers and how do YOU get them?

Continue reading I Feel the Need for Rendering Speed (or Why I Love My GPU)

Makin’ the World an Ugly Place (with Free Plugins)

In case you missed it, last week on Halloween we released a free filter called Ugly Box! The blog post is a little late for Halloween (although they are celebrating it in New Jersey today), but if you're tired of all the election nonsense, there's still plenty of time to use it to make Glenn Beck more interesting.

Ugly Box: a free plugin for After Effects, Final Cut Pro, and Premiere to make videos look worse!

You can download it here:

http://www.digitalanarchy.com/demos/ugly.html

I think one of the biggest surprises we had when we released Beauty Box 1.0 was that people kept asking us if it could make people look worse. Considering how much detail you can see on HD and how bad some people looked on HD, I didn't really think there'd be a need for a filter to do that. But we give our customers what they want…

With Beauty Box 2.0, you could set Skin Detail Smoothing to a negative number resulting in, yep, Ugliness! It takes the skin texture, amplifies it, and sharpens it, making your talent either look a bit older or flat out hideous depending on their skin and the settings. Ugly Box, the filter we're releasing for free, lets you use that aspect of Beauty Box. It's a bit of a one trick pony, you don't have all the control you do with Beauty Box, but it can definitely make the folks in your videos look a lot worse.

Anyways, all the details are below, so download it for free and have fun with it! I figured it’d be a great Halloween treat for all you visual effects artists and editors doing last minute scary videos (or election videos…). ;-)

Color Calibration

After some time off, I'm creating prints of my photos. At first, I thought this was a good opportunity to try Costco, which has been showing up at some of the photography tradeshows touting their services to pro photographers. Using Costco as a print lab seems like a strange idea, but I figured if they're promoting themselves to pros, it's worth a shot… but, no, the quality is what you would expect. Pretty awful prints. Never mind.

So let’s try Bay Photo. Good reputation as a lab… so I ordered a matted print from them. Good print, but this is what the corners of the mat looked like:

Bay photo matting

Seriously? Why even offer matting if you have zero quality control?

Looks like I'm doing this myself. Which meant calibrating the monitor and printer. I've got the ColorMunki for this purpose, but hadn't used it for a while. I'd sort of forgotten how easy it is to use and set up. I have to say I love this thing. The Cinema Display and my Epson R2000 printer are amazingly in sync. It's not perfect… you sometimes have to calibrate the monitor a few times to get it right, and I've heard it doesn't work well with older monitors, but for me it works great.

One thing I discovered is that you should calibrate the printer with full ink tanks. Changing the ink can require recalibration. The Cyan was low when I did the initial calibration. I got 3 prints out of it before it ran out. Replacing it resulted in a color shift and recalibration.

Printing yourself is still a bit of a pain in the ass, and it's not the cheapest option, especially when you factor in your time. So I may end up printing with a lab anyways. But I'm always impressed when technology works the way it's supposed to. The folks over at X-Rite have done a nice job with the Munki hardware and software.

<shameless plug> If you’d like to see some of the prints, it’s Open Studios in San Francisco this month! I’ll have a few prints at SMAart Gallery on Sutter St. exhibited with Lily Yao’s ceramics. SMAart is open the first two weekends of Open Studios: Oct. 13/14 and Oct. 20/21, so if you’re in the Bay Area come on over. More info can be found here:

http://smaartgallery.com/

Why You Shouldn’t Shoot Video with a DSLR

There’s a myth going around that DSLRs shoot great video footage. They don’t. They are not video cameras and as such, usually result in sub-par video. If you want to shoot video, go buy a video camera. Stop listening to the cool kids telling you to shoot video on your non-video camera DSLR.

It is true that if you hook up a really nice lens to your DSLR and learn all the things that a video camera does that a DSLR doesn't (and that you now have to do manually)… you might get really beautiful-looking footage. If you are an experienced videographer and/or filmmaker, you can get brilliant footage out of a DSLR. If you're not an experienced videographer, the footage is just as likely going to be shaky, out of focus, and have bad audio. (Of course, some might say that if you're not an experienced videographer your footage is going to look like that no matter what camera you use. ;-)

For the types of stuff most people are shooting on DSLRs, a sub-$2000 video camera, like the Canon XA10, will probably serve them much better. Actually, $800 handhelds are often more than most people need. I’ve got a Panasonic that shoots absolutely beautiful HD. I know, I know… it’s not as cool, but for most things, even stuff that Pro Video guys shoot, you need to jump through hoops to get a DSLR to do what you want. For example, it’s doubtful I’d let someone with a DSLR rig shoot my wedding. IMHO it’s not the appropriate camera for the job. Most people using a DSLR to shoot a wedding are probably trying waaaay too hard to be a ‘filmmaker’ and probably not as focused on just shooting my wedding. Besides, I really don’t want someone wearing a rig that looks like an orthodontic headset for Frankenstein wandering around my wedding.

DSLR rig for shooting video

Using a DSLR to shoot video is sometimes an endeavor worthy of Dr. Frankenstein.

(Of course, if you’re into creating a monster, the RedRock DSLR rig shown is a good one)

All that said… it is possible to get great footage out of a DSLR and you get to use all those super awesome lenses. The gist of this post is to get you thinking about the hype that surrounds DSLRs and video. The cameras have a LOT of shortcomings that most people are going to find difficult to work around, even video pros that are used to higher end camcorders. Before you go running off trying to use a DSLR for video, consider what you're shooting and what you want to achieve. Examine the features that DSLRs are lacking, how that will affect what you're shooting, and what you will have to buy to compensate for the missing video features. Use that as a guide to determine whether you should spend the time and money on outfitting your DSLR to shoot video or just buy a video camera that's designed for video from the ground up.

YouTube Opens Production Facility in London, LA

YouTube/Google is opening a full fledged Do-It-Yourself production facility called Creator Space in London and, if rumors are true, Los Angeles.

What does this mean?

The most immediate result is that we will probably see better produced sneezing cat and laughing dog videos. This alone is exciting. Think of the cat videos we can get with a full cyc greenscreen! The possibilities stagger the mind.

Seriously though, there’s not a lot of info about it, but the promo video doesn’t give me reason to believe any post houses or production facilities should be sweating it too hard.

Continue reading YouTube Opens Production Facility in London, LA

Cloud Services – The Ugly

So I don’t get it when people freak out about cloud services going down. It’s the internet. Outages happen. Actually, they happen to any electronics.

Should they happen frequently? No, of course not. But Google Talk going down for half a day, and Twitter, Salesforce, and Amazon all having recent outages, have made it clear that you can't trust the cloud 100%. Which is only to say that you should have backup plans in the event the cloud service you're using or your internet connection goes down temporarily (hello? Comcast? Anyone home?).

Furthermore, you should make sure all the data on your cloud service is backed up locally in the event the cloud service you’re using goes down permanently. This is a real risk if you’re using any cloud service that isn’t Amazon or Google. And even then, I’ve heard of Google deleting accounts by mistake in such a way they were unrecoverable. I back up all my Google docs once a week and download a copy of important documents as soon as I finish them.
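A local backup routine doesn't have to be fancy. Here's a minimal sketch of that weekly routine; the folder names (`cloud_exports` for wherever your downloaded docs land, `backup` for the local copy) are made up for illustration:

```shell
# Folder names are assumptions -- point SRC at wherever your cloud
# exports land and DEST at a local drive (ideally a RAID).
SRC="cloud_exports"
DEST="backup"

mkdir -p "$SRC" "$DEST"
printf 'important doc\n' > "$SRC/notes.txt"   # stand-in file for demonstration

# Copy into a dated snapshot folder, so one bad week never
# clobbers the only copy you have.
SNAP="$DEST/$(date +%Y-%m-%d)"
mkdir -p "$SNAP"
cp -R "$SRC/." "$SNAP/"
```

Run it weekly (cron, calendar reminder, whatever works) and prune old snapshots once in a while.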

While I have photos stored online, the originals are safely on a RAID 1 hard drive. I've written before about the failure of Digital Railroad, a photo storage site that went belly-up and gave users about 12 hours to download their photos before shutting off the servers. When startups go down, they go down hard, since they usually try to hold on until the last dollar runs out. When the money runs out, you can't pay for bandwidth fees, and then darkness comes (and the ice weasels. Beware the ice weasels).

So don’t get me wrong, I think the cloud is great. But as with anything, it’s good to know the limitations and be able to work around them.

Joy of Photography

One of the great things about running DA is that it gives me an excuse to buy fancy camera equipment and play with it. The latest subject I'm infatuated with is stars. No, I haven't joined the paparazzi. I'm talking about the stars you can see when you're 10,000 feet up on a rock in the middle of the Pacific (the Haleakala volcano in Maui).

(c) 2012 Jim Tierney

Photography is absolutely amazing. It really forces you to be present in the place you’re at and the moment you’re there.

Continue reading Joy of Photography

E3: Game Look vs. Film Look

I was hanging around E3 on Tuesday, indulging my gamer geek side (games are a sister industry to the film industry so I get in on an industry pass, but I have no real legit business reason to go. It’s just fun.).

One of the things I’ve noticed about games is that the ‘look’ is still very much the same as it ever was. Yes, the polygon counts are higher and everything is in HD, but the look is the same. No depth of field and harsh lighting (usually either on or off). I was looking at a couple up and coming games and they just reminded me of Half-Life and every other game I’ve played. They look better, but they don’t look like film.

This is interesting, because films are starting to look like games and I don’t think it’s a good direction. I want games to start looking like films, not the other way around.

Where is this ‘Game Look’ for films coming from? I think it starts with 3D.

One of the smaller booths (you should've seen the Xbox, PS3, and Wii booths) at E3 this year.

Continue reading E3: Game Look vs. Film Look

The Distortions of Retouching

Did you see the before and after shots that Britney Spears released? Can't say I'm a huge fan, and I would never have imagined I'd be mentioning her in the Digital Anarchy blog, and yet, I just did. She released retouched AND unretouched photos of herself, and put them out there for comparison. The article breaks down the shots and what was changed. While I think it was a very worthwhile thing for her to do, I really wish she'd given a more intelligent quote and actually addressed the issue instead of saying 'it was fun being shot in front of a wall of cotton candy'. Sigh. Click here for the article.

It reminds me of the Dove ad that takes a model from walking into the studio, through the shoot, through photoshop, and out on a billboard. Amazing commercial bringing attention to the same issues.

This might seem like an odd conversation for a company that makes software to do retouching, like Beauty Box, to be promoting.

Continue reading The Distortions of Retouching

You’re creative? You’re a scoundrel!

Why would a dishonest person honestly report their dishonest behavior in an anonymous survey? Would a creative person label himself as a chronic paper clip thief to mess with such a survey? "Why, yes, I am the person that steals everyone's pens!" Such are the questions that come up in a new report that links creativity to unethical behavior.

It requires some creativity to come up with the question of whether creative people are more dishonest, so haven't the psychologists who did the study proved themselves dishonest by their own findings, and therefore not to be believed?

My problem with this study is the way it's focused on those that are obviously creative (people working at an ad agency). The real problem, perhaps, is people that test high for creativity but have jobs that don't on the surface require creativity, like accountants and bankers. Unfortunately, when you say 'people that are creative' most of us think of artists, photographers, designers, etc. But the truth is that genius goes hand in hand with creativity regardless of your field… psychology, science, cooking, banking, whatever. The ability to look at a problem in a novel way is important for the advancement of almost anything and requires creativity. Some of the most creative people I know are computer programmers… not a profession usually associated with creativity.

The study does a disservice to creativity, by not looking at other traits such as confidence to see if there are traits that have a higher correlation with dishonesty. It may be true that to be a mastermind of evil it helps to be creative. But to announce to the world that creative people are dishonest because of an anonymous survey and co-eds counting dots seems to me to be a ‘creative’ hypothesis.

Congrats to Rob Legato for Winning Best Visual F/X

Psunami, our old product for creating realistic water, was originally developed for Titanic. Rob was the visual effects supervisor on that film and played a key role in Psunami coming into being. Arete Associates was the original developer of the wave technology and did the development for the film in conjunction with Digital Domain and Rob. He was also effects supervisor on The Aviator. After that film came out, he did a talk where he discussed the fact that the technology they needed a team of people to create in 1997 was available to anyone for $199 10 years later. A little trivia for all you visual effects artists out there. Psunami is now sold by Red Giant.

So congrats to Rob for his continued excellence in visual effects, this time for Hugo.

How To Not Be A Starving Artist

In the previous post I mention an article from NPR: Silicon Valley vs. Hollywood. In that article they quote filmmaker Tim Chey as saying: “We do it for the art, we do it because we want to tell our stories, express our stories. I, as a filmmaker, am not in it for the money.”

Awesome! Then why are you complaining about piracy? You want people to hear your stories. You’re not in it for the money. Pirates are just enabling more people to see your movie that otherwise would play at two arthouse theaters on each coast and then be forgotten. What exactly is the problem?

However, somehow I feel he’s not being completely honest about not being in it for the money.

The biggest problem that most artists run into is that if they want to be even remotely successful, they need to look at themselves as a business. This kind of sucks. Most artists became artists because they didn’t want to think about marketing, business plans, how to accept credit cards, who they have to pay off to get in a gallery, etc. Sadly, that’s the hard, cold reality of it. Either you learn how to market yourself, you give up a good chunk of your earnings to someone that will market for you (like a gallery), or you starve. (or I suppose you can subsist in a coffee shop making pretty patterns in the latte foam of hipsters who go ‘Wow, that’s cool. You should be an artist!’)

Continue reading How To Not Be A Starving Artist

Why doesn’t Hollywood get it?

NPR recently had a story about Silicon Valley vs. Hollywood. Hollywood suffers from a lot of piracy and the Valley enables some of it. Sort of. I get the feeling that Hollywood would rather the internet go away and then they wouldn’t have to deal with the change they’re apparently so scared of. They are certainly trying to legislate the internet into oblivion.

In the NPR article producer Gavin Polone says, regarding the fact that YouTube and the like are now producing their own shows: “And they will also start to look at this very expensive property as property, and they’re not going to want to have it stolen from them”.

Guess what? They're well aware of how much it costs to make content and they are definitely in it to make money. Could it be possible there are other ways to profit from content than the standard model Hollywood has used for the last 50-80 years?

Continue reading Why doesn’t Hollywood get it?

Carpal Tunnel Injuries

What are referred to as carpal tunnel injuries are usually a collection of different injuries better known as repetitive stress injuries (RSI). They are serious problems that I've struggled with to varying degrees for the last 10 years or so. One of the other anarchists has had it longer and had to work through a severe version of it.

And I am pretty much the poster child for what happens if you ignore the possibility of RSI. Back when I first started working in software, I worked with a programmer who had to have someone hired to do his typing. The company had people come in and speak about ergonomics and how to avoid RSI. I ignored all of it. Clearly these people were just weak, and I, being invincible, would never suffer such things. Not so much. Age and too many 16 hour days hunched over a keyboard/mouse tend to take their toll. The whole growing older thing is really a pain in the neck (literally).

People usually associate RSI with wrists, but in fact it can affect your arms, shoulders, neck, and back. If you’re on a computer a lot (and if you’re reading this most likely you make your living using a computer) it’s critically important that you pay attention to it. It’ll seriously affect your ability to use a computer and, if you’re a photographer, your ability to hold a camera for long periods.

Continue reading Carpal Tunnel Injuries

Graphics in Asia

So… what country do you think releases the most films? US? India?… Nope, Nigeria! This was one of the interesting tidbits that came out of the presentation Jon Peddie did at Siggraph Asia. Now, they’re not necessarily good films, but given the number of different languages (510!) Nigeria has, apparently they crank out a LOT of films (and, of course, it’s known as Nollywood).

No Nigerian Scam Here. They’re Making Movies!

This info was put out there to drive home the point that a lot of the growth we’re probably going to see in digital tools is going to come from emerging markets. This means opportunities for both software developers and artists. Granted, I don’t know how much software anyone is actually buying in Nigeria (or what they’re paying artists). However, I do know that some emerging markets, like India, are buying software and, at the higher end, apparently there are some well paid opportunities. I know several folks that are working in China, Singapore, and India.

Continue reading Graphics in Asia

Photography Capturing Changes in the World

Photojournalism has always been a huge part of photography. It has been capturing the pain and suffering of conflicts for most of the last 100 years or so. What’s somewhat new is the prevalence of cameras in the hands of amateurs, be it mobile devices or DSLRs. Much has been written about this elsewhere, so I’m not going to retread old news. However, recent events across the bay in Oakland have brought this issue a little closer to home. (Digital Anarchy is based in San Francisco)

It’s been interesting and disheartening to see the stream of photos and videos coming out of the Occupy Oakland economic protest that basically got attacked by police a couple weeks ago. No longer is it just people in faraway places like Egypt, Syria, or China using this technology and social media to show peaceful protesters being fired upon; now it’s 10 miles from where I live.

The true power of photography is its ability to capture dramatic moments, be they on the other side of the world or across a bridge. This is what makes it exciting to work within the photography community, even if much of what passes for photojournalism these days is not taken by professionals. I find the thought of having a thousand cameras in a thousand places to be an incredible way of seeing what’s happening in the world.

btw… yes, I support the Occupy movement. However, I think it’s time they moved beyond the campouts and offered some solutions. This article is a good start…

Argh, Matey! Pirates!

Once every year or two something happens to make me get a bug up my shorts about piracy. Generally I don’t care much about it… most piracy is done by college students, software ‘collectors’ (people that just download it to have it but don’t use it), and other people that wouldn’t buy the software anyways.

We recently had the technical guy at a photography studio give us a call. Their primary business is doing greenscreen photography for clients and they use Primatte for it. He called to complain that they had recently upgraded to Primatte 5.0 and that he gets an error message when he tries to run it on all his machines.

All of Digital Anarchy’s software looks for other instances of the plugin running on a network and shuts down if it sees a copy with the same serial number. This studio, which makes their living doing greenscreen, had one serial number. In his words “We have Primatte 3 installed on all our machines and never had a problem, but now it looks like we’ll have to buy more licenses. Why?”.
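For the curious, here’s a minimal sketch of how that kind of duplicate-serial check can work: each running instance announces its serial number over UDP, and an instance that hears its own serial coming from another copy shuts down. This is a hypothetical illustration only, not Digital Anarchy’s actual implementation; the port number, message format, and function names are all made up for the example.

```python
import socket

# Hypothetical discovery port for the example; a real product would pick
# its own port and message format.
DISCOVERY_PORT = 50047

def announce(serial, addr="127.0.0.1", port=DISCOVERY_PORT):
    """Send this instance's serial number out on the network.

    A real implementation would broadcast to 255.255.255.255 (or use
    multicast) and tag the message with a machine ID so an instance can
    ignore its own announcements; here we just send to localhost.
    """
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        s.sendto(serial.encode("utf-8"), (addr, port))

def duplicate_detected(serial, sock, timeout=0.5):
    """Listen briefly on an already-bound socket.

    Returns True if another instance announces the same serial number
    before the timeout expires, False otherwise.
    """
    sock.settimeout(timeout)
    try:
        while True:
            data, _ = sock.recvfrom(64)
            if data.decode("utf-8", "ignore") == serial:
                return True  # same serial seen on the network: shut down
    except socket.timeout:
        return False  # nobody else announced our serial
```

With one legitimate license, a second machine announcing the same serial trips `duplicate_detected` and the plugin refuses to run — which is exactly what the studio above was seeing.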

Continue reading Argh, Matey! Pirates!

Wherein Jim Tierney rants and opines about After Effects, Premiere Pro, Final Cut Pro, and other nonsense