Well, the past two weeks have been all about the tech and finding out just what makes avatars tick...
Kyle G from the Reaction Grid started up http://indiemetaverse.ning.com/ with the idea of developing a meeting place for users and developers of all virtual worlds to gather and share ideas and technical know-how with each other, with a final aim of meeting up in the US in 2010. There are just under 90 members already, and I thought I'd contribute some of the findings I'd come across.
My initial aim was to try and bring avatars across from SL/OpenSim into Unity 3D and create a base system that developers could import into their own worlds. I made up a few tutorials going through the process:
The really interesting point of this was that I was able to import the standard avatar, along with its bones and all its animations, into a Unity project, and from within the project I could control the avatar's bones and animations using some simple scripts. This got me thinking about how easy it would be to set up a system whereby users could create their own custom animations in-world, and why haven't LL or anyone else done this yet?
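To give a flavour of what "simple scripts" means here, this is a minimal Unity C# sketch of the idea (the bone field and class name are my own, not from any of the tutorials): a component that nudges one of the imported skeleton's bones every frame, on top of whatever animation is playing.

```csharp
using UnityEngine;

// Hypothetical sketch: sway an avatar's head bone from script.
// Assign "headBone" in the Inspector by dragging in the head
// transform from the imported skeleton hierarchy.
public class HeadSway : MonoBehaviour
{
    public Transform headBone;   // a bone from the imported avatar rig
    public float maxAngle = 30f; // degrees of yaw either side of centre

    // LateUpdate runs after Unity's animation pass each frame,
    // so this rotation is applied on top of the playing animation
    // instead of being overwritten by it.
    void LateUpdate()
    {
        float yaw = Mathf.Sin(Time.time) * maxAngle;
        headBone.localRotation *= Quaternion.Euler(0f, yaw, 0f);
    }
}
```

Because bones are just Transforms once the model is in Unity, the same trick extends to any part of the rig, which is what makes the idea of user-authored animations so tempting.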
For one, the saving of user data in-world seems to be prevented for a reason: you can't write to notecards etc., and allowing it would remove another revenue stream. The more I played, the more I could see where the SL designs came about. I was able to create my own textures in-world by painting straight onto my avatar, so why is this not available? Another revenue stream, I guess.
The possibilities that this could open up are going to be stunning, and it's great to be able to deconstruct this kind of work and see just how it's made. It's something I often do when looking at a new build in-world; I like to work out the techniques for myself and learn in the process.
Considering the way all these worlds are constructed, so that to create objects everyone has to make a series of micro-payments to upload textures/anims/sculpts etc., it will be interesting to see what happens when mesh imports become available on the SL grid, and whether the Lab is considering increasing the upload charges for this new advancement.
My ultimate aim for this is to create a more detailed mesh and bone system so facial expressions and hand/finger movements can be captured. I really want to create a new system of communication in VWs, perhaps a system with a very limited GUI and the ability to control your av's actions via mouse expressions. With most of this stuff, though, it's just something I pick at when I have the time; I like to get an idea and try to implement it, just to learn how it's done. :)