I tend to keep an eye on the High Fidelity blog as and when I have the time, and so try to keep up with developments. I’m currently waiting to see if I get into the next phase of alpha testing, as I’ve so far failed to build the client (I sucketh at tech sometimes), and I also confess to hoping for another video from AKA… This being the case, it was interesting to get a look behind the doors at what has been going on within High Fidelity, courtesy of self-proclaimed “bouncer” Dan Hope.
Dan’s blog post turns the spotlight away from the work of the core High Fidelity team and focuses it on those alpha testers / builders who have built the client, made the connection, and started poking at various aspects of the platform and the worklist.
Austin Tate is a name well-known within OpenSim and Second Life. His CV is quite stellar: he is Director of the Artificial Intelligence Applications Institute (AIAI) and Professor of Knowledge-Based Systems at the University of Edinburgh. His work has encompassed AI, AI planning, and the development of collaborative workspaces using virtual environments and tools, particularly the I-Room.
Within High Fidelity, where he is known as Ai_Austin, he’s been extending the work on I-Rooms and collaborative spaces (both of which seem to have an ideal “fit” with High Fidelity) and has been working on 3D modelling, with Dan noting:
You might have figured out by now that 3D worlds are no good if they can’t handle 3D models accurately, which is why Ai_Austin also tests mesh handling for complex 3D objects. The image above shows the “SuperCar” mesh, which has 575,000 vertices and 200,000 faces, being tested in HiFi. There are several other meshes he uses, too, including one of the International Space Station that was provided by NASA.
SuperCar has also featured in Austin’s work within SL and OpenSim, where he has provided invaluable insight into working with the Oculus Rift, the development of support for it within the viewer, and using it alongside other hardware (such as the Space Navigator). In fact, if you have any interest at all in the areas of AI, virtual world workspaces, VR / VW integration, and so on, then I cannot recommend Austin’s blog highly enough. (We also share a passion for astronomy / space exploration and, I suspect, for racing cars, but that’s something else entirely!)
Ctrlaltdavid might also be a name familiar to many in SL and OpenSim, being the HiFi name of Dave Rowe (Strachan OFarrel in SL), the man behind the CtrlAltStudio viewer which focuses on adding OpenGL stereoscopic 3D and Oculus Rift support to the viewer.
With High Fidelity, he’s working on Leap Motion integration, to provide a higher degree of control over an avatar’s hands and fingers than can be achieved through the use of other tools, such as the Razer Hydra. The aim here is to increase the sense of immersion for users without necessarily relying on clunky hand-held devices. As we know, the Leap Motion sits on the desk and leaves the hands free to gesture, point, etc., and thus would seem an ideal companion when accessing a virtual environment like HiFi (or SL) using a VR headset; or even without the headset, if one wishes a degree of liberation from the keyboard.
In opening his look at the work of the various alpha testers / builders, Dan notes:
We can’t create a truly open system without making it compatible with other open-source tools, which is why Judas has been creating a workflow that will allow artists to make 3D models in the open source program Blender using HiFi’s native FBX format.
This forms a useful introduction to the work of Judas, who has been involved in bringing High Fidelity and Blender closer together in terms of providing improved FBX support for the platform, which is now bearing fruit. “Only last week something was added in that allowed me to import the HiFi avatars into Blender without destroying the rigs we need to animate them,” Judas is quoted as saying in the blog post.