Before I gush praise on the way it was able to get those fine patterns onto the flying banshees of Avatar, the main problem with Mari is that you probably need a system on the level of a company like Weta to actually do anything with it. Apparently, version 1.2 of Mari simply refuses to work on systems not up to its requirements. My computer, with a mid-range NVidia Quadro FX1800, began to splutter on scenes that had only a few 'layers' of work on them.
But using Mari you begin to realise why it needs such raw grunt. High-resolution texture painting on 3D models was insanely quick, and the tools are Photoshop-like and easy to understand. Although painting by hand offers nowhere near the dexterity of pure Photoshop work, for me the way you can photo-source directly onto your model is where it really shines. There are options to mask, colour grade and tile your textures right in the software, then simply 'roll' them onto your UV maps. Once on, you can bring out some advanced warp, clone and smudge tools to knock it into line - all within your viewport.
Although a bit clunky at the moment, the workflow from colour map to bump to spec is easily handled with all the adjustments you'd find in Photoshop. There are (very) basic shaders to combine these, light them and preview them in the viewport. You can drop the opacities, change the layer modes and so on. It's all there, and despite this being new software, everything feels incredibly familiar. Even the navigation is Maya-based.
Arguably, Mari's rival products would be Maxon's Bodypaint and, to a lesser extent, Zbrush/Mudbox. Admittedly I have limited experience with Bodypaint; I tried it a few years back and gave up at the navigation. Zbrush, on the other hand, has some powerful texturing tools. Ideas such as ZApplink were a godsend, and integration with other packages is now seamless. Mari seems to stand alone right now: it just takes in textures and spits out textures. And unlike Zbrush, Mari is purely a 2D projection painter - you can't paint in 3D. This is presumably so textures can remain ultra crisp when they hit the big screen. But in saying that, I struggled to come to grips with where or what I was painting without actually seeing what was happening to my model. At the moment displacement maps can't be viewed, although this is changing.
And that, in the end, is what really impressed me. The Foundry really want to improve Mari, get people using it and receive feedback. I sent an innocuous tweet about it and suddenly there was an email from a representative asking for feedback and whether they could help. The fact that larger studios are adopting it too bodes well for where it's heading.
As for the actual art piece, the base mesh was NOT done by me. It was provided by the great Oliver Marc Plouffe specifically for a texturing competition run by the Foundry. These renders were done in Maya with Mental ray. Maps including colour, bump and spec were all done in Mari, with displacements in Zbrush. The fine details you can see on the goggles were largely photo-sourced and surprisingly quick to do. I hope I can finish the whole figure eventually.