Best wishes for everyone! :)
I’ve marked the start of my active job-seeking efforts by creating a Twitter account. You can also find me on Google+ or connect through LinkedIn. I’m usually around in the #techart and #UDKC IRC channels.
UPDATE: Things seem to be fine again.
So far only older tutorials seem to be affected, but I haven’t checked everything yet. I’m working on it, and hopefully I can fix (hack) everything within the next 48 hours.
I’m also cleaning up my domains and folder structure, so certain links might be broken, but I’ll fix them soon.
UPDATE: The site was moved successfully and everything seems to be in order. Yay!
It’s been a while since the last post… I’m currently working on Gavit, writing a longish MetaSL-related tutorial and another one for a FilterForge filter I’ve just finished. I hope I’ll be able to post something a bit more interesting in the next few weeks.
Back in 1998 I saw an ad in a local PC magazine for a gamepad that registered tilting. I was awestruck by the simple but great idea, and a few weeks later I bought a Sidewinder Freestyle Pro. Over the following years I played Midtown Madness and Freespace 2 with it for many, many hours, and I’ve been a fan of motion controls ever since.
I’m using this treasure chest from Gavit as the example asset in an upcoming tutorial about real-time, high-definition texturing and baking:
All the work is done in Max 2012’s viewport with instant feedback. Well, except for displacement mapping, which still needs a MentalRay render (last image).
I kept testing Photofly to see what it can do. The biggest project so far has been this dead tree:
100 images, 660 MB of data, 46 manual reference points. The resulting mesh has 800K polygons, with one 4K and one 2K texture.
The photo session took about 15 minutes. I grew restless towards the end and started to sample the surface more sparsely, which caused tracking problems and low-resolution textures in certain areas. Had I spent another 15 minutes taking more images, I could have saved the two hours of manual stitching.
The mesh is rather smooth and unnecessarily dense, so I think the poly count could be halved without problems.
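(Any decimation tool would do the halving; just to illustrate the idea, here is a minimal vertex-clustering sketch in Python. This is a toy example of mine, not the tool or algorithm Photofly or Max actually uses: vertices falling into the same grid cell are merged into one, and faces that collapse or duplicate are dropped.)

```python
# Toy vertex-clustering decimation: merge vertices that share a grid cell,
# then rebuild the face list without degenerate or duplicate triangles.
def decimate(vertices, faces, cell_size):
    """vertices: list of (x, y, z) tuples; faces: list of (i, j, k) index triples."""
    cell_to_new = {}   # grid cell -> index of the merged vertex
    remap = []         # old vertex index -> new vertex index
    new_vertices = []
    for x, y, z in vertices:
        cell = (int(x // cell_size), int(y // cell_size), int(z // cell_size))
        if cell not in cell_to_new:
            cell_to_new[cell] = len(new_vertices)
            new_vertices.append((x, y, z))  # keep the first vertex in each cell
        remap.append(cell_to_new[cell])
    new_faces, seen = [], set()
    for i, j, k in faces:
        a, b, c = remap[i], remap[j], remap[k]
        key = tuple(sorted((a, b, c)))
        # drop faces that collapsed to an edge/point, and exact duplicates
        if a != b and b != c and a != c and key not in seen:
            seen.add(key)
            new_faces.append((a, b, c))
    return new_vertices, new_faces
```

A coarser `cell_size` merges more vertices and removes more faces; real tools (quadric-error decimation) preserve shape far better, but the principle of trading density for fidelity is the same.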
I’ve just tried Autodesk’s Photofly, a photo-based 3D modeling service, and the results are pretty good.
My test project was a small rock, roughly 4 x 4 x 3 cm (1.5 x 1.5 x 1.2 inches) in size. I took 57 images, 14 Mp (4000×3000) each, under almost perfect diffuse lighting conditions. Processing the pictures, from file selection to the final high-resolution mesh, took about 15 minutes. This includes uploading everything to the cloud and getting a draft mesh as an intermediate step.
I learned that there are a few things to keep in mind before you start taking pictures:
- Shiny surfaces confuse the feature detection algorithms, matte materials work much better.
- Diffuse lighting is highly recommended because it helps eliminate specular highlights and also prevents sharp shadows from getting “baked” into the diffuse texture.
- Depth of field can be an issue, especially when taking photos of small objects in macro mode.
- An easily distinguishable, single-color background seems to be important.
I plan on scanning different rock, wood, dirt, etc. surfaces and building a library of diffuse/bump map pairs, which could be used directly as textures or as brushes to paint onto meshes.
It’s weirdly satisfying to kick a partially deflated balloon.