Monday, August 27, 2012

D5100 @ shrubbery

I got a new camera this summer, a Nikon D5100. Here's a test video I did last weekend... I'll just copy the text from the video description here, lazy me...



Some tests with my new Nikon D5100 (and a Tamron 11-18 mm wide zoom), first outdoor footage. I tried to push the camera a little with high-contrast / high-detail scenes.

I also have a Nikon D90, which is a nice stills camera, but its video quality is pretty appalling. As I have some decent Nikon glass, when I wanted a DSLR with usable video quality, I went for yet another Nikon body, the D5100.

The overall impression is that the D5100 is a less serious camera: lighter, more plasticky, with way poorer battery life and so on. For stills I prefer the D90, even though it's older and has lower resolution. The D5100 makes for a decent second body, and sometimes the extra resolution can be useful. And the articulating LCD is a very nice feature.

As far as video quality goes, the resolution is decent, even though it clearly resolves less than the full 1920x1080. Still, it's a huge leap from the D90. The camera handles highlights and overexposure pretty decently, and I like the colors overall. There's a shot at the end where I adjust the exposure from over to under; this should give an idea of how the camera handles these. Unfortunately, the camera shows rather severe moiré on fine details, seen e.g. in the ocean surface as crawling blue-red artifacts. These are partially hidden by Youtube compression and look worse in the camera originals.

The other thing I tried out was the Tamron 11-18 mm wide zoom. It's a fun beast: 11 mm is pretty darn wide, but the lens still handles geometry well, without visible distortion. There are a few shots at the end of the clip where I walk around handheld with the wide lens (tripod still attached to give the rig a little more mass for stabilization). Pretty wild even with such simple moves.

Camera settings: Picture Control Neutral; Sharpening 0; Contrast -1; Brightness +; Saturation 0; Hue 0; ISO 100; auto ISO control ON; max. sensitivity ISO 1600; min. shutter speed 1/50; aperture priority mode.

With these settings, I get semi-manual operation. I first choose the aperture. When I switch to live view, the camera chooses the rest of the exposure parameters automatically, but they can be nudged with the +/- exposure compensation to arrive at the desired ISO and shutter speed.

The camera will use ISO 100 with the current f-stop, and the +/- setting varies the shutter speed. If the shutter speed hits the threshold of 1/50 while increasing exposure, the +/- setting starts adjusting the ISO instead, until the second threshold of ISO 1600 is reached. After that, the +/- again adjusts the shutter speed.
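Out of curiosity, here's that stepping logic sketched in C++ - just my model of the behavior, obviously not Nikon's actual firmware, and the one-third-stop step size is an assumption:

    // A model of the auto-exposure stepping described above. One call equals
    // one +1/3 EV click of the +/- exposure compensation.
    #include <algorithm>
    #include <cmath>

    struct Exposure {
        double shutterSec; // exposure time in seconds (longer = brighter)
        double iso;
    };

    Exposure brighten(Exposure e) {
        const double slowestShutter = 1.0 / 50.0; // the "min. shutter speed" threshold
        const double maxIso         = 1600.0;     // the auto ISO ceiling
        const double stepFactor     = std::pow(2.0, 1.0 / 3.0); // +1/3 EV per click

        if (e.shutterSec < slowestShutter)        // phase 1: lengthen the shutter
            e.shutterSec = std::min(e.shutterSec * stepFactor, slowestShutter);
        else if (e.iso < maxIso)                  // phase 2: raise the ISO
            e.iso = std::min(e.iso * stepFactor, maxIso);
        else                                      // phase 3: shutter again
            e.shutterSec *= stepFactor;
        return e;
    }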

In the vast majority of cases, the result is the same I would have chosen with full manual control. When I hit an exposure combination I like, I just press the AE-L button to lock the settings. It's a bit of a flimsy, backwards way of adjusting exposure, but it works well enough.

Overall, I'm pretty happy with the purchase. This is not the best video DSLR, but it's pretty affordable, offers semi-decent image quality and, most importantly for me, lets me use my existing Nikon glass.


Friday, August 24, 2012

The Pi is in the closet

I've totally forgotten to mention the Raspberry Pi, a barebones Linux computer board about the size of an Arduino (smaller than a pack of cigarettes). It costs a few dozen bucks and comes with decent connectivity, including HDMI.

It has a decent amount of computing power (similar to a modern cell phone, AFAIK), so it can even process live HD video. It could be a pretty cool alternative for the brains of the moco rig, or, say, a standalone camera tracker together with an Arduino / IMU.

It's rather nifty, I must say, and I have one in my closet. Still unopened, but I'm eagerly waiting to play with it...

Saturday, August 18, 2012

It's been a while, part deux a.k.a. research and development

It's indeed been a while, mainly because there hasn't been much new development on the hardware side.

Or maybe just because I've been lazy...

Here's the beef. Hands-on work with the gear is currently on hiatus, as we're in the midst of a move to a new studio, and have been for the last few months. All the stuff is stacked in boxes in a huge pile. The new studio will be pretty cool - we will have a much bigger green screen, with two cyc walls.

While the physical gear has been on hold, I've been looking into the software side of things. Not so much for the motion control stuff, but rather for a virtual studio (though the two are related).

First of all, I've learned a bit of C++. I'm no guru, but I've gotten my first small programs to compile successfully. It's a start.

First dabblings...


A virtual studio needs a few building blocks (apart from the obvious green screen, camera etc. hardware): a block to tell where the camera is, a block to get the details of its lens, a block to ingest the video, a block to create the 3D background images, a block to remove the green screen and composite the talent onto the BG, and finally the code to tie all of these together into a user-friendly application.
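To make the decomposition concrete, here's how those blocks might look as C++ interfaces - the names and signatures are purely my own invention at this point:

    // Hypothetical interfaces for the virtual studio building blocks.
    #include <opencv2/core/core.hpp>

    struct CameraPose { cv::Matx33d rotation; cv::Vec3d position; };
    struct LensParams { double focalLengthMm; double distortionK1; };

    struct ITracker     { virtual CameraPose pose() = 0; };          // where is the camera?
    struct ILensReader  { virtual LensParams lens() = 0; };          // zoom / focus / distortion
    struct IVideoSource { virtual bool grab(cv::Mat& frame) = 0; };  // ingest live video
    struct IRenderer    { virtual void render(const CameraPose&, const LensParams&,
                                              cv::Mat& background) = 0; }; // 3D BG images
    struct IKeyer       { virtual void composite(const cv::Mat& fg, const cv::Mat& bg,
                                                 cv::Mat& out) = 0; };     // key + composite

    // The glue code would then boil down to a per-frame loop along these lines:
    void processFrame(ITracker& t, ILensReader& l, IVideoSource& v,
                      IRenderer& r, IKeyer& k, cv::Mat& out) {
        cv::Mat fg, bg;
        if (!v.grab(fg)) return;
        r.render(t.pose(), l.lens(), bg);
        k.composite(fg, bg, out);
    }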

A lot of the stuff needed is open source, and could probably be hacked together with some glue code. So far so good.

I think the most complicated issue is accurately retrieving the camera's position and rotation.

I've toyed with the idea of attaching a Kinect sensor to the main camera. Some source code for Kinect Fusion-like tracking is available, but incorporating it isn't necessarily easy, to say the least. I already have the Kinect, so I will play around with this stuff for sure.

Another route I have considered is a 9DOF IMU - there are some Arduino-compatible ones available, and they are plenty good enough for e.g. UAV drones, but I'm not sure whether they are stable and accurate enough for the high-precision tracking the camera needs. I'll probably buy one and test it in practice, as well as the cheap PS Move controllers, which work e.g. with LightWave3D's virtual studio tools (which I definitely need to look more into too, by the way).
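To illustrate what that stability question is about: hobby IMU boards typically fuse the gyro and accelerometer with something like the complementary filter below (a single-axis sketch with made-up sensor inputs, not code from any particular board):

    // Bare-bones complementary filter for one rotation axis. Gyro integration
    // drifts over time; the accelerometer is noisy but drift-free, so the two
    // get blended. The inputs are hypothetical placeholders.
    double angleDeg = 0.0;                   // current angle estimate, degrees

    void updateAngle(double gyroRateDps,     // gyro rate, degrees per second
                     double accelAngleDeg,   // angle derived from the accelerometer
                     double dt)              // time step, seconds
    {
        const double alpha = 0.98;           // trust the gyro short-term, accel long-term
        angleDeg = alpha * (angleDeg + gyroRateDps * dt)
                 + (1.0 - alpha) * accelAngleDeg;
    }

For yaw there is no gravity reference, so the correction has to come from the magnetometer (the ninth DOF) - and the residual drift and noise of exactly that correction is what makes me doubt these sensors for pixel-accurate camera tracking.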

The optimum case would probably be something like what iPisoft does for mocap: tracking the production camera's position using multiple cheap cameras (namely the PS Eye), which is similar to what many established virtual set manufacturers already do. Unfortunately, this is all commercial software, so there's no way to get the source, and I've had a bit of a hard time finding even whitepapers on the subject. I / we will very likely purchase iPisoft at some point though, to enable markerless motion capture of actors in our studio - a topic definitely worthy of its own post, but that's for later.

Although the OpenCV computer vision library includes some camera tracking algorithms, they are not readily suitable for this task, as far as I can see. But it can probably be used for grabbing the video frames and processing them on the GPU - OpenCV has bits for e.g. calibrating lenses, which should come in very handy. I'll probably need to write my own keyer; so far I've written a rudimentary one that works in real time, at least with a webcam. I've earlier done a more complex keyer for After Effects in the Pixel Bender language (shame on Adobe for discontinuing it in CS6).
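For scale, a rudimentary real-time keyer along those lines fits in a few dozen lines of OpenCV - this is a sketch of the general idea (simple HSV thresholding), not my actual keyer:

    // Minimal green-screen keyer sketch: threshold greenish pixels in HSV and
    // composite the rest of the webcam frame over a background plate.
    #include <opencv2/opencv.hpp>

    int main() {
        cv::VideoCapture cap(0);                       // default webcam
        cv::Mat bg = cv::imread("background.jpg");     // hypothetical BG plate
        cv::Mat frame, hsv, mask, inv, out;
        while (cap.read(frame)) {
            cv::cvtColor(frame, hsv, cv::COLOR_BGR2HSV);
            // Keep pixels whose hue is roughly green and reasonably saturated.
            cv::inRange(hsv, cv::Scalar(45, 80, 60), cv::Scalar(85, 255, 255), mask);
            cv::bitwise_not(mask, inv);                // inv = the talent, not the screen
            cv::resize(bg, out, frame.size());
            frame.copyTo(out, inv);                    // talent over the background
            cv::imshow("key", out);
            if (cv::waitKey(1) == 27) break;           // Esc quits
        }
        return 0;
    }

A real keyer obviously needs much more - soft edges, spill suppression and so on - but hard thresholding like this already runs easily in real time on a webcam feed.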



While all this happens to the video, the camera's parameters are sent to a 3D engine that renders the virtual scene from the real camera's point of view. The most likely candidate for this is Ogre, which is open source.
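Hooking the tracked parameters into Ogre should be fairly direct - roughly along these lines, assuming the pose arrives as a position, an orientation and a field of view (a sketch against the Ogre 1.x API, untested):

    // Feed the tracked camera pose into an Ogre camera each frame.
    #include <OgreCamera.h>

    void applyTrackedPose(Ogre::Camera* cam,
                          const Ogre::Vector3& position,
                          const Ogre::Quaternion& orientation,
                          float verticalFovDeg)
    {
        cam->setPosition(position);                  // match the real camera's location
        cam->setOrientation(orientation);            // ... and its rotation
        cam->setFOVy(Ogre::Degree(verticalFovDeg));  // match the lens's field of view
    }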

The last bit needed is the glue code to make this into an actual application (or a suite of applications).

How hard can it be?

Pretty hard, I'm afraid, but hey, others have succeeded in doing this stuff, so even if I don't, it may be an interesting ride to try ;-)

Monday, February 27, 2012

It's been a while

Routing the cables through the bearing.
...since the last update. That's, well, mainly because there hasn't been much progress to report on. I've been rather busy with Real Work (TM). I did manage to get a bit of time to work on the moco head a while ago; the current assembly is a kind of hybrid of the metal and plastic ones: I used some steel bits to reinforce the plastic construction.

Further, after a pretty long adventure in the bellies of the customs and postal offices, some more parts arrived - namely, the motors and control board for the moco (I put the original parts into the CNC machine). The Domo*pes board has the motor drivers built in, which is rather convenient. So, the moco head now has the motors, as well as the necessary power transmission, belts and pulleys, to make it work.

While assembling, I noticed some design silliness in my original plans - nothing that would stop the head from working, but stuff that I will definitely redesign for future models, if I ever get that far. Again, the lesson here is that what looks good on the computer screen may work, but it just as well may not.

Head on top of the temporary base.
In other words, planning ahead does reduce the number of mishaps, but doesn't eliminate them altogether - with my engineering skills, at least... In this case, the way I thought I'd secure the motors (just by the friction from sandwiching the mounting plates against the frame) was way too flimsy, so I added extra bolts to tighten it up. A somewhat working temporary solution, but far from optimal.

Some things worked as planned, though: routing the cables through the center hole of the large aluminum bearing at the base of the moco head still feels worth the effort. This way the cables are not a problem when e.g. panning the head more than 360 degrees. Of course, even with this configuration, the cables will get twisted if the head is rotated too many times, but I don't think that will be a big issue in practice.

Also, I did assemble a temporary base for the head - well, to some extent. No wheels or motors etc. yet, just the frame. This is actually not at all how I planned it, but it might work for now. The good thing with this new configuration is that it'd run on a narrower track, the same width I use with the manually operated miniature dolly. This would be a big plus for compatibility, obviously.

Sunday, January 8, 2012

Not the droids part deux.

Today I had a few hours again - this time, I assembled the plastic version of the moco head (left). Again, this is not the final version; it's missing the motors, belts, pulleys and so on, as well as the whole base. In this pic it's resting on top of my manual mini dolly, which is essentially just a 75 mm cup for a video head, with wheels. Simple, but works perfectly...

Tuesday, January 3, 2012

It's alive!! (Part One)

Yep, the CNC stuff works now, more or less. After a long night at the studio, I managed to get the thing to do what's expected of it. So, it not only moves under computer control, carving out stuff, it does so with at least some degree of accuracy: a supposed 10 cm move is a real 10 cm move, also according to the tape measure. Also, I found the right combination of software to make the machine accept G-code, the standard format for this type of machining.

That was not exactly a trivial task - I tried dozens of combinations that didn't work. For example, there are many flavors of the controller board I use, the Arduino. A piece of software that's said to be "Arduino compatible" may or may not work on any given version of the board. As the software is mostly written by hobbyists, it may easily be that the author has only tried it with one hardware combination.

The board I used for this machine is an Arduino Mega 2560. It's a bit of an overkill for this task, so it's not too surprising that most CNC software is originally written for lesser (and older) Arduino versions. There's also quite a lot of software out there for 3D printers, which use the same principles as this machine (XYZ motion, G-code instructions) but add control of the extruder head - a kind of hi-fi version of a hot glue gun - and other extras. I tried those too, but honestly my skills (and especially my patience) were not up to the task of bypassing all that excess code to make the software work with my crude barebones machine.

I don't actually remember the exact combination I finally ended up using, but I'll try to document it when I get back to the studio. Also, some coding was needed to make my machine work: some basic configuration (e.g. how many motor steps make one mm of movement) as well as bypassing some code (I do not have end stop switches on the machine, for example).
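That steps-per-mm figure, by the way, falls straight out of the drivetrain numbers. A quick example with typical hobby values (not my machine's actual specs):

    // Steps-per-mm for a belt-driven axis, with typical hobby-CNC constants
    // used purely for illustration.
    #include <cstdio>

    int main() {
        const double fullStepsPerRev = 200.0; // a 1.8-degree stepper motor
        const double microstepping   = 16.0;  // driver microstep setting
        const double beltPitchMm     = 2.0;   // GT2 belt tooth pitch
        const double pulleyTeeth     = 20.0;  // teeth on the motor pulley

        double mmPerRev   = beltPitchMm * pulleyTeeth;                   // 40 mm per revolution
        double stepsPerMm = fullStepsPerRev * microstepping / mmPerRev;  // 80 steps per mm

        printf("steps per mm: %.1f\n", stepsPerMm);
        return 0;
    }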

After all this, the machine works, as said - but just barely. I can already see dozens of improvements that need to be made for this to be an actually usable tool instead of just a fun curiosity project. And from what I've read of other people's experiences, that is an endless road: many people end up building multiple machines, each better, more capable and more accurate than the last. I wouldn't be too surprised if that happened to me too...