High Fidelity have issued a progress report for the second quarter of 2015, which has been circulated to users via e-mail and made available as a blog post.
In the report, they highlight recent achievements and work, including:
- The fact that they’ve been hiring-in new talent (and are still looking for more). Nor is the talent restricted to employees: at the end of May, Professor Jeremy Bailenson of the Virtual Human Interaction Lab at Stanford University and Professor Ken Perlin both joined High Fidelity’s growing list of high-powered advisors
- The instructions and video on setting up the stack manager to run your own High Fidelity server have been updated, with the promise that next up will be an option to share your server resources with other nearby users who need extra capacity
- The ability to track and capture head movements and facial expressions with a regular webcam, as an alternative to needing a 3D camera
- The arrival of the High Fidelity Marketplace, where you can drag and drop content into your server and upload content you want to share with others. This is currently a sharing environment rather than a commerce environment, but the promise is that the commerce aspect will be coming soon
- Commencing work on implementing distributed physics, building on top of the open source Bullet physics engine, with the aim of having low latency for interactions while maintaining the same state among participants – such as when people in different locations are playing Jenga or billiards together
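The blog post doesn’t detail how the distributed physics will work, but one common way to get low-latency interaction while keeping participants in agreement is to give each object a single simulation owner whose updates are authoritative for everyone else. The sketch below illustrates that idea only; all the names are hypothetical and none of this is High Fidelity’s actual API (the real system builds on Bullet):

```python
from dataclasses import dataclass

@dataclass
class ObjectState:
    position: float   # 1-D position, for simplicity
    velocity: float
    owner: str        # participant currently simulating this object

class Participant:
    """Each participant simulates only the objects it owns and
    accepts authoritative updates for everything else."""
    def __init__(self, name):
        self.name = name
        self.objects = {}

    def step(self, dt):
        # Locally integrate only owned objects (a stand-in for the
        # real physics engine) and collect updates to broadcast.
        updates = {}
        for oid, obj in self.objects.items():
            if obj.owner == self.name:
                obj.position += obj.velocity * dt
                updates[oid] = (obj.position, obj.velocity)
        return updates

    def receive(self, updates):
        # Apply authoritative state from the owning participant.
        for oid, (pos, vel) in updates.items():
            if self.objects[oid].owner != self.name:
                self.objects[oid].position = pos
                self.objects[oid].velocity = vel

# Two participants sharing one object, owned by "alice"
alice = Participant("alice")
bob = Participant("bob")
for p in (alice, bob):
    p.objects["ball"] = ObjectState(position=0.0, velocity=2.0, owner="alice")

updates = alice.step(dt=0.5)   # alice simulates locally...
bob.receive(updates)           # ...and bob converges to the same state
```

The appeal of this scheme is that the owner never waits on the network for its own interactions, which keeps latency low, while everyone else eventually sees the same positions.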
- The ability to import web content into High Fidelity – static web pages, videos, interactive web pages, etc., complete with a demonstration video and the promise of figuring out the best ways to allow the different types of shared browsing that people are going to need
- My personal favourite: zone entities, skyboxes and dynamic lighting with spherical harmonic lighting and optional sync to real-world day/night cycles
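For readers unfamiliar with the term, spherical harmonic lighting typically stores ambient light as a handful of coefficients and reconstructs approximate diffuse lighting per surface normal with a dot product. The following minimal sketch evaluates just the first two SH bands; it illustrates the general technique, not High Fidelity’s implementation:

```python
# Real spherical harmonic basis, bands 0 and 1 (four coefficients).
# An environment's light is projected onto these basis functions once,
# then cheaply reconstructed for any surface normal at render time.
def sh_basis(x, y, z):
    """Evaluate the first four real SH basis functions for a unit normal."""
    return [
        0.282095,          # Y_0^0  (constant term)
        0.488603 * y,      # Y_1^-1
        0.488603 * z,      # Y_1^0
        0.488603 * x,      # Y_1^1
    ]

def sh_irradiance(coeffs, normal):
    """Approximate diffuse lighting at a surface with the given normal."""
    x, y, z = normal
    return sum(c * b for c, b in zip(coeffs, sh_basis(x, y, z)))

# Example: light mostly arriving from above (+z)
coeffs = [1.0, 0.0, 0.8, 0.0]
up = sh_irradiance(coeffs, (0.0, 0.0, 1.0))     # upward-facing surface
down = sh_irradiance(coeffs, (0.0, 0.0, -1.0))  # downward-facing surface
```

Because the whole lighting environment collapses to a few numbers, it is cheap to animate those coefficients over time, which is what makes syncing to a day/night cycle practical.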
Also in the Next Steps aspects of High Fidelity’s development is the intriguing promise of avatars with soft bodies, which are capable of interacting physically, or as Philip Rosedale puts it in the blog post, “imagine sword-fighting, for example”, while being driven by hand controllers such as those coming with the HTC / Valve Vive or for the Oculus Rift. This links back to the work going on with the physics engine, which, as Mr. Rosedale explains in the blog post, carries an added level of complexity within High Fidelity due to the distributed nature of the platform and the need to maintain consistency between players as to what is happening, where things are, who is controlling what, and so on.
For those wishing to keep abreast of the key points of what is going on with High Fidelity, but who do not necessarily have the time to jump into every blog post that comes out, these updates are a useful means of tracking core events within the platform.