How We Streamed 24 Hours and Raised over $100k for #Dancember


Wow.

I’m still decompressing from this weekend, not to mention the setup last week, all of which went toward putting on a live-streamed broadcast for Dancember, an annual charity event by Benji and Judy of BenjiManTV and ItsJudyTime, respectively. There were a number of goals, including:

– Have the stream going for a full 24 hours via YouTube Live

– Raise $100,000 for the charity, Rescue Freedom

– Have everything be high quality and professional

These goals sound fairly straightforward to the common reader, but I can tell you as a video professional that the first and last items were exceptionally intimidating to me at first. I’ve had plenty of experience streaming video, and I’ve been on production crews at the Gorge, but those two worlds don’t cross paths very often, and don’t kid yourself: 24 hours of live, high-quality streaming video with multiple cameras and a (relatively) amateur team isn’t an easy task. I chalk up our success to one main thing: belief in the cause. Aside from that, there were many hours of preparation spent on this, from Benji, Judy and their team, to Guy and myself at DVE figuring out how to do all this, and finally to executing it.

Interested in how? Well here you go:

I was first made aware of all of this about a week before the event. Guy and I met with Benji and Austin (a very promising young videographer) and discussed the plan. Over the next few days Guy and I worked out a system for them, tested some streams from our office with YouTube Live, and had a rough picture of how this would work.

Starting Tuesday, it was time to get going. We had to make sure that Benji’s internet connection could support a 24-hour HD stream. We had to make sure our system, a NextComputing machine designed especially for use with Wirecast for streams, could handle the load. We had to make sure the cables for the cameras reached the right areas, the wireless camera worked, audio was solid, Google Hangouts could be brought in, music could play, and that we could handle all this tech by ourselves.


I brought the streaming computer, a Blackmagic Design Studio Camera, some cabling and more over to Benji’s on Tuesday, set it all up, and started a stream… only to realize Tuesday night it hadn’t worked. So after some research and text messaging Tuesday night, I returned Wednesday morning with some more gear we’d need and tried the stream again, this time on Benji’s account… and it worked for a solid 23 1/2 hours (I stopped it Thursday morning on purpose), which gave us great hope we were heading in the right direction. To achieve the stream, we were using Wirecast 5 and linked it directly to Benji’s YouTube account, which Wirecast makes exceptionally simple. All we had to do was name the stream, make sure the settings were all there (video quality, allow chat, etc.), and hit go live. We considered upgrading to Wirecast 6, which I made a few tweets about last week during their webinar, but since I hadn’t actually used it, we stuck to our guns, for which I’m glad. The computer has a Blackmagic Quad card in it, so it was a simple SDI run from the camera to the card, which went right out for the test stream and eventually the actual event.
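Wirecast handled the encoding and the YouTube handoff for us entirely through its GUI, but if you’re curious what that handoff looks like under the hood, here’s a minimal, hypothetical sketch of pushing a capture-card feed to YouTube Live over RTMP using ffmpeg driven from Python. The ingest URL, stream key, DeckLink device name and bitrates below are placeholder assumptions, not what we actually ran.

```python
# Hypothetical sketch only: roughly the RTMP push Wirecast performs behind its GUI.
# Requires an ffmpeg build with DeckLink support; device name, stream key and
# bitrates are placeholder assumptions, not values from this broadcast.
import subprocess

STREAM_KEY = "xxxx-xxxx-xxxx-xxxx"                      # placeholder from the YouTube Live dashboard
INGEST_URL = f"rtmp://a.rtmp.youtube.com/live2/{STREAM_KEY}"

cmd = [
    "ffmpeg",
    "-f", "decklink", "-i", "DeckLink Quad (1)",        # assumed capture input name
    "-c:v", "libx264", "-preset", "veryfast",
    "-b:v", "4500k", "-maxrate", "4500k", "-bufsize", "9000k",
    "-g", "60",                                         # keyframe roughly every 2 s at 30 fps
    "-c:a", "aac", "-b:a", "128k", "-ar", "44100",
    "-f", "flv", INGEST_URL,                            # YouTube ingests RTMP as FLV
]
subprocess.run(cmd, check=True)                         # blocks while the stream runs
```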

Now, this is where things get a bit complicated. I can’t say there’s a “secret sauce” to doing all this, but it helps to have time to try a few things out. Basically, the Studio Camera with a mounted Rode Stereo VideoMic X was in the living room as Camera 1, running SDI to a SmartView HD monitor and then back to the streaming computer. We used this mic to keep the audio sounding natural while still being better than an on-camera mic. If I did this shoot again, I’d ideally want to lav some people up, or maybe place a few more room mics I could control independently, but this was a solid solution.

Second, we had a Sony a7 attached to a V-mount battery via a nifty SWIT adapter that powered both the camera and the Teradek Bolt, which we used for latency-free wireless video. This ran into the Quad card by converting the Bolt’s HDMI out to SDI with an Atomos Connect (any HDMI-to-SDI converter would do). Since the rig was heavy enough as-is, I didn’t want to add a mic to it, so it was used mostly as a no-audio B cam during dance segments, and when there were kitchen pieces (Benji’s cooking segment, etc.) we used a Sennheiser G3 to mic him up.

The only other video source we used was Google Hangouts, for when they’d interview other YouTubers, and that ended up being the real challenge. Basically, the idea is that Benji and Judy use Hangouts to speak with the Saccone Jolys, the Knive Nulls, or whomever else they wanted, and I needed direct audio/video from the Hangout even though they wanted to be in the living room or upstairs. Now, if you don’t know, HDMI doesn’t run well past 15′, and usually not at all past 25′ without a DA, so the video run would be tricky, and that says nothing about how to pass audio too. My solution was to have another computer next to my station that was also signed into the Hangout, listening to the full conversation and seeing all the video. I simply passed its video over to my streaming machine via an AJA ROI, so the video was done. Next, I had an Edirol digital USB mixer for audio, and simply ran the headphone out via RCA into the board, giving me independent control from the rest of the stream. This would’ve been absolutely perfect, with one exception…

In the living room, they wanted music to play so they could dance, and they needed guests to hear it as well. The problem is that in a Google Hangout, that much open sound on both sides creates feedback, something I didn’t foresee in my testing. Our solution was to pipe the headphone out of their living room iMac (their Hangout computer) into a splitter, with one end going to a small stereo system and the other going into a Sennheiser G3 transmitter, giving me wireless control of that computer’s audio as well. In the end, the audio going to the stereo had to be rerouted to actual headphones to avoid any feedback. Next time I’ll avoid this by putting a dedicated audio board for the Hangout machine in the garage and running the living room audio out to it, cutting off any potential for feedback, since the living room computer won’t be used as anything but a monitor.

Another interesting challenge was communication. We brought a Lectro in-ear monitor system, but for someone inexperienced in broadcast, the last thing you want is to constantly distract them. What then? Well, we used the Chromecast on Benji’s living room TV and connected a shared Google Doc from a laptop in the garage / production area, and basically used it as a teleprompter, showing them the active segment, video clips in the queue, and any other info or notes they should have out there. Quite effective.

All in all, it was quite successful. However, being the guy I am, I’ve already worked out ways to improve for next year if I’m invited back, or if I ever need to do something similar again. I mentioned the way I’d improve audio for the Hangout system, but there’s more:


1. Use Skype instead of Hangouts. Google Hangouts, while great, is very difficult to have any control over. You can’t go to split screen, audio dictates who is on the video, and things can get hectic. Skype offers more control over a few of these things, which makes it better suited to this type of broadcast.

2. The NextComputing system performed admirably, but we still lost the stream twice when the whole computer froze. I had it back up in about 3 minutes each time, but even that is unacceptable for mission-critical broadcast. Wirecast does let you save all the settings you create for a specific broadcast, so once the machine was back up I could just open that document and begin streaming again (there’s a rough sketch of a freeze watchdog after this list). I never had the CPU usage over 80%, but even that is remarkably high to run a computer at for 24 straight hours. In the future, I’d use a hardware switcher, such as the Roland VR-50HD, and send a single signal into Wirecast on the computer, which would be doing nothing other than sending that signal to YouTube. Additionally, I’d consider sending a second signal through a Teradek Cube via ethernet for redundancy, something I could’ve done this time, but testing the stream successfully for 24 hours beforehand had me thinking I was fine. Live and learn.

3. As good as most of it looked, I didn’t have graphics / titles for interviews or info, which is a must next time. They weren’t asked for, but would’ve been nice. Creating them in Wirecast during the stream wasn’t really an option with the CPU constantly over 70%, but it’s something to prepare well in advance next time, as it could’ve added a lot of polish to the stream.

4. Better chair. Seriously. There are some things you just don’t think about when setting up for such a long broadcast, and seating was the last thing on my mind. After 8 hours you feel like your legs want to fall off. After 16 you feel like you’re being tortured. After 23, you are sore in any position you try to stand or sit in. Next year, instead of one of those cheap cafeteria-style chairs, it’ll be a solid desk / computer chair, and yes, it’s actually that important.
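On the recovery point in #2 above, here’s the promised sketch. This is purely hypothetical (we ran nothing this fancy), but a tiny watchdog script sitting beside the encoder can at least flag a freeze the moment it happens rather than whenever someone glances at the monitor. The process name and the alert action are placeholder assumptions.

```python
# Hypothetical watchdog sketch: poll for the encoder process and yell if it vanishes.
# Written for Windows ('tasklist'); swap in 'pgrep' on macOS/Linux. The process name
# and the alert action are placeholders, not part of the actual Dancember setup.
import subprocess
import time

PROCESS_NAME = "wirecast"   # placeholder; match whatever your encoder's process is called

def encoder_running() -> bool:
    out = subprocess.run(["tasklist"], capture_output=True, text=True).stdout
    return PROCESS_NAME in out.lower()

while True:
    if not encoder_running():
        # Replace with an SMS/email hook, or relaunch the saved Wirecast document.
        print("ALERT: encoder process not found, restart the stream!")
    time.sleep(10)   # check every 10 seconds
```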

So that’s really about it. Every piece of what I discuss here is pretty intuitive, easy to figure out, and seems almost obvious, until you put it all together for the first time and run it for as long as this.

On every broadcast, and for every person, there are always going to be things you’ll wish you could’ve done better, been better prepared for, or simply wish had gone easier. Seriously, watch an NFL game on TV; you’ll see mistakes even there. It just happens. The trick is to be prepared for whatever could plausibly go wrong, and then prepare a bit more. If you haven’t hit any bugs during setup, you probably haven’t tested thoroughly enough. Anyone I know in production, including myself, gets very uneasy if no problems arise during setup and testing, and I’d much rather come across them there than during a broadcast.

Finally, remember that unless you do work for the NFL, you probably don’t have access to the absolute top-of-the-line everything, and that’s okay. It’s not about how slick your setup looks; it’s how slick the results are. If you can make something amazing happen by solving problems with a little creativity, you’ve already succeeded. We work in a field that doesn’t really let you simply fail. You try, find a problem, get around it somehow, and remember that next time. And always, ALWAYS remember you’re on a team. Despite being the lead technical guy at this event, I definitely relied on the help and suggestions of those around me, especially Austin, to solve problems (the Chromecast prompter, for example, was his idea).

Now we ask, how much better can we make it next year?

Click here for my gear checklist for Dancember (prior to revision)


Camera Control in the new Blackmagic Design Software


Blackmagic Design recently introduced a software update for their ever-more-popular line of live video production solutions, many of which I’ve mentioned on my site at one point or another. It gives you a lot more control and options than ever before, especially when paired with their relatively new Studio Camera.

A coworker and friend, Aaron Hale, had the chance to update some of our equipment and be the first to put some of these new features to the test. He sent me over some of his thoughts, and after reading them I decided to have him touch the write-up up a bit so I could post it for you all here, on my site, verbatim. I hope you find it informative!

Setting up camera control on the Blackmagic Studio Camera

One of the latest features Blackmagic has packed into its newest round of updates is ‘Camera Control’: a comprehensive color control that also gives a show director the ability to focus individual cameras and match all the cameras on a set together with previously unattainable ease. Paired with the already powerful Studio Camera, ‘Camera Control’ adds an extremely useful tool to an already well-equipped setup.

The first step to getting camera control working is to update both your ATEM switcher and Studio Camera. The specific update for the Studio Camera is Blackmagic Camera Update 1.8.1 (the Studio Camera’s USB port is on the bottom of the camera, which can make updates really annoying). Your ATEM switcher just needs to be on the latest switcher update, 6.0. Once everything is updated, you’ll need to launch the ATEM software, which requires you to set up your switcher again with your PC’s IP settings. From there you’ll see the new control tabs at the bottom of the software control.

Getting the Studio Camera set up is a little tricky. In addition to your video out over SDI, you’ll also need a ‘program’ return SDI cable from your switcher. The ‘program’ out is what gives you access to the newly installed ‘Camera Control’ (fiber optic cable also carries camera control commands, just add Blackmagic’s Studio or Camera Converter to the chain). Once you have everything hooked up and communicating, you’ll be able to go to the ‘Camera’ tab, where you have control over 6 video inputs.


Controls include lift, gamma, gain, contrast, hue, lum mix and saturation, along with iris control and complete focus controls. The CCU is styled like DaVinci Resolve, and anyone experienced with DaVinci will be able to transfer their creative color correcting skills over to live productions. Whether filming a live concert or shooting a music video, the potential to push color schemes to the extreme while ‘live’ gives even more options for a creative production. Changing light conditions, as found at concerts or theatres, can also be accounted for and adjusted even more quickly than previously possible.

These new features give producers a level of control over the look and feel of their productions that was previously unattainable at such low prices. I updated a Television Studio switcher, BMD’s cheapest ATEM, to carry out this test. For around $4,000 you could have a switcher and three Studio Cameras all controlled from a central software panel, and if you’re one camera operator short, you can still remotely focus a stationary main cam for any situation. Pretty incredible.

Blackmagic really hit it out of the ballpark with this update. There has been a slew of good updates coming out of Blackmagic recently, and hopefully they keep addressing necessary firmware updates. But this one is above and beyond in my opinion: it adds a totally new element to live mixing for all levels of productions. Hopefully this write-up helps you see the potential and uses of the new ‘Camera Control’ feature on your ATEM/Studio Camera pairing.  –  Aaron Hale

Video Crew for Sasquatch Music Festival 2014


My favorite updates to write about on this website are ones that come directly from being in the trenches. This one is no exception.

I was hired by Red Element Studios in Seattle, WA to help with their main stage video production at the Gorge during Sasquatch! 2014. Originally planning to just be an engineer/tech, I ended up being needed at multiple stations, and I’m very glad for that, as it gave me priceless experience on a true professional level, on par with any live music production around. It also taught me a few lessons, both technical and not, that I’d like to share (along with some photos and gear, of course).

Lesson 1: Be Versatile and Flexible

The whole reason I ended up keeping my spot on this team is the fact that I can do more than troubleshoot switcher set-ups or help set up a production. Knowing how to correctly switch and direct a shoot, how to operate a PTZ (pan-tilt-zoom or Robo Cam), and how to operate other cameras with manual zoom/focus were all vital functions I ended up being asked to perform. Now, I hadn’t operated a PTZ camera in any sort of professional capacity before this, but absolutely could now. For the rest: My experience in high school video production taught me what’s generally expected of a director/switcher, and even then, switching between 3-4 predetermined cameras is vastly different than switching between 8 live cameras and a graphics computer while you have to call out shots/check focus and iris for your camera guys. This is experience you can learn on the fly, but don’t be surprised if you’re overwhelmed the first few times out. Don’t be timid, don’t get frustrated, and don’t give up if you make a mistake or two.

As for camera operation, it can become easy to get pigeon-holed into one aspect of production or another: someone who mostly switches might let their camera skills get rusty, and pro camera guys might not be as skilled at overseeing multiple shots and directing/cutting at the right times during a show. But both positions should know something about the other. If a director doesn’t know how to adjust a camera’s focus, iris or WB, they won’t be effective at helping camera guys know what settings to have dialed-in. Likewise, if a camera operator doesn’t know generally what shots a director wants from each position of the shoot, they might constantly be going for shots that other cameras can get easier or better.

Lesson 2: Don’t Keep Secrets, Don’t Stop Learning

Chances are good that if you’re going on a shoot with a production company, there will be people of various skill/experience levels there. Sometimes there will be an obvious Greenhorn grabbing people their lunch, and sometimes there will be a jib operator with $40k worth of their own gear on site and an IMDB profile containing TV and features. You just never know. So if you’re the experienced one, a hired gun bringing your own gear and able to promise a certain level of quality, don’t be the one who won’t answer any questions or offer advice. It took you time, money, luck, and plenty of help from others to get to where you are, and everyone starts somewhere. Likewise, if you’re less skilled, don’t be afraid to ask questions or try to learn on the job. I spent a good portion of my time at this shoot explaining to people what I was doing, or why my troubleshooting worked for a particular issue we had. I also spent a while learning techniques from camera operators more skilled than I am in hopes that I’ll continue to grow. There’s always someone who can teach you something.

Lesson 3: Keep Your Cool

This sounds pretty intuitive, and also like an offshoot of the “be flexible” mantra, but it’s so important I wanted to highlight it, and there are different meanings to it. If you’re fortunate enough to reach the level where you’re shooting high-profile acts, be it concerts, comedians or actors, there’s no time or place for an ultra-fan freakout. These are still real people, and when they’re away from the general public, they just want a breather from autographs, selfies and the spotlight. If you or your crew ever want to be in that position again, it’s important to act like you’ve been there before. The second meaning of keeping your cool is that your job on this shoot WILL be stressful, mistakes WILL be made, and invariably a problem or two (or 12) WILL present itself. Being a professional in this line of work doesn’t mean you never make mistakes or never get overwhelmed; it simply means that you’re able to look at these instances and either fix them quickly, learn from them, or find an alternate solution to get you out of the woods.

Lesson 4: Enjoy Yourself

Don’t get me wrong, you are going to work HARD. Every morning we woke up around 8:30am to make the 30-minute drive from Ephrata, WA to the Gorge. Upon arriving, we immediately started setting up, checking cameras and screens, recording equipment, cables, everything. The first act on the main stage started at 1pm every day, and as the day went on, 45-minute sets turned into 1.5-hour sets, running all the way up to 12:30am that night. After the last act, we spent at least an hour tearing down cameras to bring in, putting batteries on chargers and preparing for the drive back to the hotel. To save you the math, this is roughly a 16-hour day for 4 days straight (including the setup day before day 1), with about 6 hours of sleep each night.

The crazy thing, however, is that you’ll be shooting a camera or switching for a band after hours and hours, and randomly realize, “Hey, I’m doing what I love right now. Tens of thousands of people are watching the screens my video is playing on. I’m on stage with a camera at the Gorge shooting video of Outkast in front of a sold-out crowd,” and so on. In those moments, I can’t stress enough how important it is to hold onto that feeling, because so many people out there would kill for half of the passion and satisfaction our career gives us as creative professionals. Never take that for granted, and never shut out those moments when you’re reminded of how much you love doing what you do.

Misc. Suggestion:

One of the issues we came across a few times was bad cabling. Luckily our crew had enough people that we could spare someone to run cable for a camera if needed, but after the first time, we decided to run extra cable lines to the more difficult areas so we had backups. I strongly recommend this, especially when there are other crews on-site who could tamper with or damage your runs, intentionally or not.

So, those are some of the bigger, more general lessons I took from this experience. I’m going to include a few photos here to give you a taste of how it looked, and as always, if you have any questions about the gear or the shoot in general, hit me up in the comments section or through the Contact Me page.


A look inside the production RV

New 4K Reflecmedia Footage for Keying

At NAB 2014, I was once again present to help run the Reflecmedia booth. This year, however, we decided to bring the demo into 2014, and overhauled the booth for 4K: Blackmagic Design Cinema Camera 4K, 4K Production Studio switcher, and a 4K monitor to show the composite. From the crowd response, it was quite a success.

The main reason we wanted to do this is simply that cameras and deliverable content are either going 4K, or being shot in it and then down-converted to HD (did you ever think you’d be reading the phrase “down-converted to HD” this soon?). Basically, people need solutions that hold up at four times the resolution of the past, and we wanted to show that Reflecmedia’s green-screen solution is perfectly at home in this new, more detailed spectrum.

So, one day on the stand, a lovely girl accepted our offer to stand in for some of the blue screen footage we recorded, and Reflecmedia has now released it as a sample, so you can try your hand at keying a 4K video. Sound fun? Just click over here and you can download the file and try it out!

Video Series: How to Connect your Blackmagic ATEM to your Network or Directly to your Computer


So, between work and my blog/YouTube channel, I end up troubleshooting one type of issue more than any other when it comes to the ATEM-style switchers: connections. Whether it’s your first time connecting one out of the box, or your first time moving it from a network connection to a direct one, it can be tricky to either find the right settings on your computer or adjust them properly on your ATEM. With that in mind, I present three separate videos that address these issues. Video 1 deals with setting up a connection over a network (using a router), Video 2 is about directly connecting your ATEM to your PC, and Video 3 covers directly connecting it to your Mac.

These videos assume you have connected up the ethernet and USB cables already, and focus solely on the software/computer side of this. I hope it helps!
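As a quick companion to the videos, here’s a minimal sketch of the underlying check: your computer and the ATEM have to sit on the same subnet before the ATEM software will see the switcher. The ATEM address below is the factory default I’ve typically seen (192.168.10.240 with a 255.255.255.0 mask), and the computer address is just an example; confirm your switcher’s actual IP over USB in the setup utility rather than taking my word for it.

```python
# Minimal sketch: are the computer and the ATEM on the same subnet?
# Both addresses below are assumptions; check the switcher's actual IP over USB.
import ipaddress

atem_ip = ipaddress.ip_address("192.168.10.240")         # assumed ATEM factory default
computer = ipaddress.ip_interface("192.168.10.50/24")    # example static IP on the computer

if atem_ip in computer.network:
    print("Same subnet: ATEM Software Control should find the switcher.")
else:
    print("Different subnet: give the computer a static IP in the ATEM's range, "
          "or change the ATEM's IP over USB to match your network.")
```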

Video 3 coming soon…

Camera Comparison: AJA CION vs. Blackmagic Design Ursa


Today marks the release of my second article on HDSLRShooter.com, another industry website I write content for. I tend to wait to write articles until something big comes along, and something with wide appeal at that. In this case my trip to NAB 2014 put me in position to be in the room as the first people saw the brand new AJA CION camera body, and then the new Blackmagic Design Ursa only a few hours later. What I saw was one company that has successfully created exciting cameras for a couple of years now, and another company, new to the game, that took years of ideas and thoughtfully produced a very intuitive camera. Let me be clear: BOTH cameras have uses and niches, and you honestly couldn’t go wrong either way at the price point. What I saw in the CION, however, was the next stage of evolution for the thousands of users out there who are still holding onto their HPX-500 and already have all of the accessories for it.

I hope the breakdown I give helps you out if you’re considering either option, and I’m definitely open to feedback or additional questions you might have after reading.

Read my review on HDSLRShooter.com here

Comparing Real-World Experience Between Two Switchers


Let me preface this post by saying one thing: I do not believe there’s such a thing as a one-size-fits-all solution when it comes to video production, if money is any object. Any product out there is going to focus on one main aspect, be it versatility, image quality, price or any number of other things. For the type of streams we tend to do, however, I believe we’ve truly found our Elysium: the Roland VR-50HD.

I’ve written about this unit a couple of times now in various capacities, but while I’d tested and played with it in-house, last night marked the first time I used it in a production. We were tasked with streaming HD video for an SEO meetup in Seattle, WA hosted by Add3, featuring a presentation by Matt Brown of MOZ. If you’re interested in seeing the presentation/stream itself, click here to see the archived shoot.

When doing other shoots in the past for Adobe or the Final Cut Pro User Group meetings, we’ve typically used the Blackmagic ATEM family with varying levels of success. The image quality is top notch, the price point is fantastic and it offers a lot of power. The main issues we had with this setup, however, came when trying to bring the presentation computer directly into the switcher and pass it back out to the projector. We’ve used the Edirol VC-300HD, AJA ROI and even an Atlona converter, and depending on the computer itself, sometimes we still couldn’t accomplish this.

So, for this one, we needed to ensure it went off perfectly, and at the last minute we decided to go with the Roland (still bringing all the converters, just in case). Our standard setup time for a 3-camera + computer source shoot is usually around 1.5 – 2 hours, and again, sometimes we still didn’t fully succeed. So, we arrived 2 hours early with every piece of gear we thought we could potentially need. Using the VR-50HD, setup only took 20-30 minutes with everything working and fully tested. If you consider that there were 4 employees on site for this production, saving at least an hour on setup per shoot could begin to make up the extra price tag very quickly.
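To put rough numbers on that last claim: the figures below are placeholder assumptions (a made-up blended labor rate and price premium, not real quotes from this shoot), but they show how quickly saved setup time can eat into a higher sticker price.

```python
# Back-of-the-envelope only; the rate and price premium are invented placeholders.
crew = 4                      # people on site for this shoot
hours_saved_per_shoot = 1.5   # roughly a 2-hour setup cut to about 30 minutes
hourly_rate = 50              # assumed blended labor cost, $/hour
price_premium = 7000          # assumed extra cost vs. our old switcher setup, $

savings_per_shoot = crew * hours_saved_per_shoot * hourly_rate
print(savings_per_shoot)                    # 300.0 dollars saved per shoot
print(price_premium / savings_per_shoot)    # ~23 shoots to break even
```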

Roland VR-50HD with Sound Devices Pix 240i Monitor and Recorder

So for this shoot, our cameras were: a Blackmagic Cinema Camera 4K, a Blackmagic Cinema Camera EF mount, and a Blackmagic Pocket Cinema Camera (room cam), plus one source for the presentation computer (a MacBook). The whole program was brought into the Sound Devices Pix 240i via SDI, then passed through to our streaming computer using a Blackmagic Design Quad card and uploaded via Livestream Producer. The weird thing about this particular setup is that the VR-50HD doesn’t work in standard 24 or 30 fps formats, instead using 59.94 or 50. The Blackmagic Design cameras don’t have a native 50 or 59.94 frame rate, and in this case we could only get them to work when set to 25p. It was almost enough to make us switch back to an ATEM, but it ended up not being an issue. Still, it’s something to consider if you have a producer who demands 24p, for example.

Finally, with the presentation computer we’d experienced another issue in the past: even if we found a converter that nicely brought the computer signal into the switcher, oftentimes it kicked it out to the projector with a green or pink washout of all the color. What this tells you is that the color space being used by the converter isn’t translating to the projector correctly. Using the VR-50HD, this happened at first, but within the menu you can manually tell it which color space to output. In our case we needed RGB 0-255 instead of whatever the automatic setting had chosen, and by pressing maybe 3 buttons we had a perfect signal on the projector as well as for the stream.
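For anyone curious what “RGB 0-255” is actually doing, here’s a rough numerical sketch of full-range versus limited (“video”) range 8-bit RGB. It isn’t the whole story behind a green or pink tint, but a range mismatch on its own is enough to give you lifted blacks and clipped whites, that flat washed-out look.

```python
# Standard 8-bit full-range (0-255) <-> limited/video-range (16-235) mappings.
def full_to_limited(v: int) -> int:
    return round(16 + v * 219 / 255)

def limited_to_full(v: int) -> int:
    return round((v - 16) * 255 / 219)

print(full_to_limited(0), full_to_limited(255))    # 16 235: black is lifted, white is capped
print(limited_to_full(16), limited_to_full(235))   # 0 255: correct interpretation restores the range
# If one device outputs limited range and the next assumes full range (or vice
# versa), nothing in the image ever reaches true black or true white.
```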

Again, it’s my opinion that there’s no such thing as a one-size-fits-all answer (unless you can drop $50k-$100k on an ultra-pro switcher), but considering the many headaches we’d encountered on productions like this, we’ve finally found our solution. Of course, when it’s time to use more than 4 sources we’ll be back to the Blackmagic, as the VR-50HD can only run 4 active at one time (though you can switch between inputs on each channel).

All in all, I’m really just hoping some of this information helps anyone out there, and if you have specific questions about the switcher, production, or anything else mentioned, please drop me a line!

HDSLRShooter Contributor

So, I’ve been sitting on this news for about a month now, patiently awaiting the right time to let everyone know: I’m now a contributor for HDSLRShooter.com, a great website that keeps you up to date on all things photo/video related to your camera, and on being a shooter in general.

My first article is an elaboration and expansion on a post I first created here on Jesse’s Gear, which might become something of a theme. I love writing on this site, and I try to keep my posts concise and bite-sized so they can be read quickly (while the info is still hot), and sometimes just to get ideas or news out there. When writing for a much bigger website, however, I’ll be spending more time filling out articles with experience, technical data, and a little more personality. I truly hope you’ll still enjoy coming to this site for news and bits about new and niche gear, but I also hope you’ll check out HDSLRShooter, where I hope to continue to provide great info and insight in this rapidly changing industry.

TV White Space Radio System Approval and your Wireless Microphone System


Yesterday, someone in a Facebook group I belong to posted this article: TV White Space Radio System Approved by the FCC for Sales in the USA.

The article discusses the advances in wireless broadband internet broadcasts, and the frequency ranges that this affects. Basically, they’re discussing upcoming additional traffic in the 470MHz to 698MHz range, which happens to be exactly where almost all wireless microphone systems in the US operate.

Now, this isn’t the first FCC allocation that has affected this industry, with 2010 still somewhat fresh in many audio technicians’ minds. But what does it mean for you now? First, an excerpt from the article:

RuralConnect delivers unprecedented broadband connectivity by utilizing “TV white space” frequencies, 470 to 698 MHz in the USA, with superior signal propagation characteristics. These vacant UHF TV bands – hence the term ‘TV white spaces’ (TVWS) – were recently opened by the FCC for free, unlicensed public use, a major development that holds great promise for bringing broadband internet to long-underserved rural and remote populations.

Sounds pretty daunting, right? Well, maybe. After reading this, I got in touch with my guys at Sennheiser, and after a few hours of email tag, received this response last night:

“Yes, this is true.  TV Band Devices (TVBD), also known as White Space Devices (WSD), are one of the upcoming technologies which the 700 MHz Digital Dividend made a reality.  They operate just as wireless microphones do, using white spaces.  These devices are mainly targeted at bringing broadband internet access to rural areas.

If preventing local wireless microphone interference is a concern, operators can utilize the designated TV channels for wireless microphones and/or register with Spectrum Bridge (
www.spectrumbridge.com)  when and where wireless microphones are to be operated.  In each local market, the FCC has allocated two TV channels which TVBD are not allowed to operate on.

The good news is that these devices are mainly operated in rural areas, hence the availability of spectrum should be greater than in densely packed metro areas due to the fewer terrestrial TV stations.  I am copying [name withheld] on this email, as he is very familiar with the changing RF landscape in the aftermath of the spectrum auction, and may have additional pertinent info.”

So, there’s a good chance it won’t affect you. There’s a good chance that if it does, you can proactively do something about it. Basically, if your living revolves around your A, B or G block wireless kits, I’d recommend checking out the Spectrum Bridge site, and generally keeping an eye on this story for now, but I wouldn’t worry too much.
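If you want to sanity-check your own kit against that 470-698 MHz range, the logic is as simple as the sketch below. The block ranges I’ve filled in are illustrative placeholders, not a definitive list; read the actual tuning range printed on your transmitters.

```python
# Quick overlap check against the US TV white space band discussed above.
# Mic block ranges here are illustrative placeholders, not a definitive list.
TVWS_MHZ = (470.0, 698.0)

mic_blocks = {
    "Example block A": (516.0, 558.0),
    "Example block G": (566.0, 608.0),
}

def overlaps(a, b):
    return a[0] < b[1] and b[0] < a[1]

for name, rng in mic_blocks.items():
    verdict = "inside the TVWS band" if overlaps(rng, TVWS_MHZ) else "clear of the TVWS band"
    print(f"{name} ({rng[0]}-{rng[1]} MHz): {verdict}")
```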

EDIT: I received this additional response today from a Sennheiser lobbyist for Washington, D.C. –

As [retracted] pointed out, there are currently two TV channels in each market reserved for mics that can’t be used by Carlson’s product and other forthcoming white space devices.  These are the channels all mic operators should use first.  If more are required (for example if operating more than 16 mics or monitors), additional channels can be reserved for performance times by:

1) licensed users (broadcasters and content creators) through direct access to the database system.
2) unlicensed users can request access to the database system from the FCC, 30 days before their performances begin.
Spectrum Bridge is one of the four approved databases.  More will be available.  You only need to register with the one of your choice. They are all linked and will reflect valid registrations system wide within 15 minutes.

A New Player in Streaming Media: The Paladin


I’ve had the fortune over the last week of being given a demo unit of a new breed of streaming device: an all-in-one streaming package that combines HD video capability and power with extremely portable versatility. It’s called the Paladin, and it comes out of my local city of Seattle, WA.

While I assume our location is one reason Paladin sought us out for testing, they also know that we have good experience with multiple methods of video streaming. As a company launching a new solution into a still-new market, they wanted to know how it stacks up against other methods out there. Since I’m going through this process of discovery for them, I’d like to share my findings with you on the internet as well.

First, about the Paladin itself: it’s housed in roughly a 12″x8″x3″ case, with the custom-designed computer taking up most of that space. It contains a Blackmagic Design Quad card, Wi-Fi and Ethernet connections, HDMI, multiple source and audio inputs, as well as USB. The Quad card provides up to four SDI inputs/outputs at once. All of this is tied together on a Windows 7 platform running Wirecast.

It doesn’t provide an out-of-the-box monitor, keyboard or mouse, but these days you’ve probably got a small graveyard of abandoned and discarded components you could use for that anyway.

“So, when would I use this beefed-up computer for streaming?”

I’m glad you asked, arbitrary voice questioning my blog.

The wireless feature with Wirecast is the first awesome aspect. You can take any IP-ready camera, or run a camera through a Teradek Cube encoder, and the Paladin can add these as sources over wireless. Just let that sink in. You could, in theory, get a bunch of people with iPhones connected to the venue Wi-Fi, using their phones as IP cameras for a live stream. The live music industry alone would make this a smash success. Want better quality? Grab a camera capable of HDMI or SDI and a Teradek Cube encoder, and you can shoot that signal straight to the Paladin and Wirecast via Wi-Fi.

The second main feature: Paladin has developed their own iPad app to run the switching software. I’ve been able to put this to the test successfully, but the main issue people have had with the Strata app for controlling Blackmagic Design switchers is that it’s not a mission-critical solution. I have to agree: despite the fact that I enjoy using it and it has performed well for me, it’s not something I’d choose over my broadcast panel hard-lined into my ATEM. When it comes to a production you’re being paid for, it’s not terribly ideal to rely on an iPad and a wireless connection, but it’s great as a first or additional option, or even for a producer/director to have just in case.

“So, if I have to use an external monitor and/or keyboard, is it still that portable?”

Definitely. In fact, we dropped a few bucks on this XBOX – Halo portable case with monitor / HDMI in, and the Paladin fits perfectly in there, making it extremely ready-to-go.

“How does it compare to flypack and ATEM style solutions?”

While I understand why you would ask this question, I honestly believe they’re two different markets. If you are considering an ATEM switcher in a travel case, you’re likely looking for 6-10 sources, external audio control, a heavy-duty computer or Teradek for encoding, etc. (that, or you’re hoping to get there soon). If you’re looking at an option like the Paladin, it’s my opinion that you’re planning to mostly be a one-person director/engineer, you’re happy with 4 or fewer sources, and you want a streamlined process that eliminates more of the “moving parts” of component-style flypacks. While I could probably direct and engineer a shoot with fewer than 8 cameras and manage audio, that’s a pretty tall order for anyone if any amount of troubleshooting needs to happen, and I certainly wouldn’t feel I was playing it 100% safe.

“Why not just kit-out my current computer for this?”

Now, this is probably the toughest question to answer, and I’d have to say: it depends. Honestly, I believe if you have a small computer body with the motherboard, processor and RAM offered in the Paladin, you could very feasibly build a comparable system. But even with all that, you still need the Quad card, the Intensity card, a Wirecast license, and you still don’t get the Paladin software that way. So, when I do the math on my ability to rip apart and build computers (almost non-existent), gather all the components, and tie it all together, a nicely optimized package like this becomes a really attractive option.

Besides, all of the above assumes you even have a computer body that’s not a laptop. So many creative professionals I know rely on their laptop for live production or as their home edit machine, so it’s rare these days that people have a tower they could just Frankenstein into something like this.

The idea of all of this is to introduce you to a new solution from a local company that I feel is going to be contributing and innovating in this industry for some time, and I definitely enjoyed my time testing out the Paladin.

In a weird way, I even like the Black and Pink color scheme they’ve got going on.