Author Archives: Inara Pey

About Inara Pey

Eclectic virtual world blogger with a focus on Second Life. My blog can be found below and I'm semi-active on Twitter and Plurk.

Viewer release summaries: week 47

Updates for the week ending: Sunday November 23rd, 2014

This summary is published every Monday, and is a list of SL viewer / client releases (official and TPV) made during the previous week. When reading it, please note:

  • It is based on my Current Viewer Releases Page, a list of all Second Life viewers and clients that are in popular use (and of which I am aware), and which are recognised as adhering to the TPV Policy. This page includes comprehensive links to download pages, blog notes, release notes, etc., as well as links to any / all reviews of specific viewers / clients made within this blog
  • By its nature, the summary presented here will always be in arrears; please refer to the Current Viewer Releases Page for more up-to-date information

Official LL Viewers

  • Current Release version:  3.7.19.296094, unchanged (release notes)
  • Release channel cohorts (See my notes on manually installing RC viewer versions if you wish to install any release candidate(s) yourself):
    • HTTP Pipeline RC viewer version 3.7.21.296736 released on November 17th – reduces the pipelined texture and mesh fetching timeout from 150 seconds to 60 seconds, so that stalled connections fail quickly, allowing an earlier retry (see the sketch after this list; download and release notes)
  • Project viewers:
    • Viewer-managed Marketplace project viewer version 3.7.21.296858 released on November 21st – allows Merchants to manage inventory associated with Marketplace Listings from within the viewer + sale of items for which Merchants do not have the right to copy will now be supported with the Direct Delivery purchase mechanism (download and release notes)
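Just to illustrate the thinking behind the HTTP Pipeline timeout change: a stalled request is abandoned once the timeout expires and then retried, so a shorter timeout simply means the retry happens that much sooner. The following is a minimal Python sketch of that general pattern only – not the viewer's actual C++ code – and the URL is a made-up placeholder; only the 150-second and 60-second figures come from the release notes.

```python
import requests

# Placeholder URL purely for illustration; not a real Second Life endpoint.
ASSET_URL = "https://cdn.example.invalid/texture/some-uuid"

def fetch_with_timeout(url, timeout_s=60, max_attempts=3):
    """Abandon a stalled request after timeout_s seconds so it can be retried.

    Dropping the timeout (e.g. from 150 s to 60 s, as per the RC viewer's
    release notes) means a dead connection costs less time before the retry.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return requests.get(url, timeout=timeout_s)
        except requests.exceptions.Timeout:
            print(f"Attempt {attempt} stalled after {timeout_s}s; retrying")
    raise RuntimeError("All attempts timed out")
```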

LL Viewer Resources

Third-party Viewers

V3-style

  • Kokua OpenSim updated to version 3.7.81.33408 on November 18th – core updates: GPU benchmark update from LL; additional digit in version number to indicate when features have been cherry-picked (release notes)

V1-style

  • Cool VL viewer updated – Stable branch to version 1.26.12.24 and legacy branch to 1.26.8.82 – both on November 22nd (release notes)

Mobile / Other Clients

  • Mobile Grid Client updated to version 1.22.1226 on November 20th – core updates: support for Android 5.0 Lollipop; experimental /sit and /stand chat commands (change log)

Additional TPV Resources

Related Links

Magic Leap: bringing augmented reality to film in 2015

Magic Leap technology is to be “premiered” at a UK festival in 2015, in a special film / show entitled The Age of Starlight (image: Manchester International Festival)

Professor Brian Cox may not be a familiar name to everyone, but in the UK and for those with an eye for science on television, he has become something of England’s answer to Neil deGrasse Tyson.

Professor Brian Cox

Cox, who played keyboards in the pop group D:Ream whilst studying physics at the University of Manchester in the 1990s, started his television career in 2005, appearing on the BBC’s science and philosophy series, Horizon.

Since then, he has fronted a range of science programmes and series, as well as appearing on chat shows on both sides of the Atlantic. He's even had a guest-starring role in the adventures of the very master of time and space itself, Doctor Who.

Now, the BBC reports, he will be presenting a cutting-edge show / film (which he is also scripting) entitled The Age of Starlight, telling the story of the universe, and intended to be one of the focal events of the 2015 Manchester International Festival. The production will feature visual effects by Framestore, the team that won an Oscar for their work on the 2013 George Clooney / Sandra Bullock sci-fi vehicle Gravity, and will be directed by Kevin Macdonald, whose films include the Oscar-winning The Last King of Scotland and One Day in September, and the BAFTA-winning Touching the Void.

But what makes The Age of Starlight particularly interesting is that it will utilise augmented reality technology being developed by Magic Leap, the company that hit the headlines in October 2014, when it received $542 million in funding from a broad range of investors.

For those of you who missed it, Magic Leap is the company behind a headset that uses augmented reality to combine realistic computer graphics with everything the wearer sees in real time, in what the company calls “cinematic reality”. The results can be startling, going on the available promotional material: tiny elephants in the palms of your hands, dragons flying among flocks of birds,  yellow submarines sailing through streets, humpback whales floating over crowded beaches, and more.

One of the Magic Leap promotional images: a yellow submarine apparently floats down a street the Magic Leap wearer is walking along

Magic Leap merges realistic computer graphics with everything the user sees in the real world, in what the company calls “cinematic reality”.

However, beyond the stunning promotional images and video, the company has publicly revealed very little about what it is up to. But what they have shown behind closed doors has been enough to get John Markoff from the New York Times very excited, and has been sufficient to get Google to lead that US$542 million (£346 million) round of investment in October, which itself came on top of an initial $50 million of funding earlier in 2014.

Given all the apparent mystery surrounding Magic Leap, Sean Hollister, over at Gizmodo, decided to spend a little time digging around to find out more about what Magic Leap is all about.

In his article, Hollister starts out by framing something of the company's history, revealing that Magic Leap has been chipping away at things for quite a while. In a fascinating track through that history, he references their 2011 collaboration with Weta Workshop on something called The Hour Blue, as reported by Dice (see the video, below). This still appears to be around today, although exactly what it is isn't clear. This collaboration may have been the reason why Weta's co-founder, Richard Taylor, opted to make a personal investment in Magic Leap during the $50 million round of funding, and why he now sits on the board of directors.

Making augmented reality of the kind Magic Leap is trying to achieve is a significant challenge, as Hollister explains:

If you’re looking at the real world, your eyes are focusing at a variety of different distances, not necessarily on a tiny piece of glass right in front of your face. The real world also reflects a lot of light into your eyes, which is why the images from heads-up displays like Google Glass appear transparent and ghostly. Because you need to see the real world, you obviously can’t have a projector covering the front of the glasses: that light has to be bounced in from the side, which generally results in a narrow field of view.

And of course, you need some way to track your head and your surroundings so that CG objects appear to occupy a real place in the world, instead of looking like a flat image— which, sadly, is how many existing augmented reality specs do it.

Given this, Hollister reasoned, the best way to understand what the company might actually be developing is to take a look at the patents they have filed and which address such challenges. In taking this line, he’s actually following the lead set by Tom Simonite, a bureau chief at MIT Technology Review.


Monty Linden discusses CDN and HTTP

Monty Linden talking CDN and HTTP

In show #46 of The Drax Files Radio Hour, which I’ve reviewed here, Draxtor pays a visit to the Lab’s head office in Battery Street, San Francisco. While there, he interviews a number of Linden staffers – including Monty Linden.

Monty is the man behind the Herculean efforts in expanding and improving the Lab's use of HTTP in support of delivering SL to users, work which most recently resulted in the arrival of the HTTP Pipeline viewer (the code for which is currently being updated).

He’s also been bringing us much of the news about the content delivery network (CDN) project, through his blog posts; as such, he’s perhaps the perfect person to provide further insight into the ins and outs of the Lab’s use of both the CDN and HTTP in non-technical terms.

While most of us have a broad understanding of the CDN (which is now in use across the entire grid), Monty provides such great insights and explanations that I thought it worthwhile to pull his conversation with Drax out of the podcast and devote a blog post to it.


Monty Linden talks CDN and HTTP with Draxtor Despres on the Drax Files Radio Hour

Monty starts out by providing a nice, non-technical summary of the CDN (which, as I've previously noted, is a third-party service operated by Highwinds). In paraphrase: the aim is to get essential data about the content in any region as close as possible to SL users by replicating it in as many different locations around the world as possible; then, by assorted network trickery, to ensure that data can be delivered to users' viewers from the location closest to them, rather than having to come all the way from the Lab's servers. All of which should result in much better SL performance.

“Performance” in this case isn't just a matter of how fast data can be downloaded to the viewer when it is needed. As Monty explains, in the past, simulation data, asset management data, and a lot of other essential information ran through the simulator host servers. All of that adds up to a lot of information the simulator host had to deliver to every user connected to a region.

The CDN means that a lot of that data is now pivoted away from the simulator host, as it is supplied by the CDN's servers instead. This frees up capacity on the simulator host for handling other tasks (an example being that of region crossings), leading to additional performance improvements across the grid.

LL’s CDN provider (Highwinds) has 25 data centres around the world and a dedicated network from and through which essential region data on textures and meshes (at present) can be delivered to SL users

An important point to grasp with the CDN is that it is used for what the Lab refers to as “hot” data: the data required to render the world around you and other users. “Cold” data, such as the contents of your inventory, isn't handled by the CDN. There's no need, given it is inside your inventory and not visible to you or anyone else (although objects you rez and leave visible on your parcel or region for anyone to see will have “hot” data (e.g. texture data) associated with them, which will gradually be replicated to the CDN as people view them).

The way the system works is that when you log in or teleport to a region, the viewer makes an initial request for information on the region from the simulator itself. This is referred to as the scene description information, which allows the viewer to know what's in the region and start basic rendering.

This information also allows the viewer to request the actual detailed data on the textures and meshes in the region, and it is this data which is now obtained directly from the CDN. If the information isn’t already stored by the CDN server, it makes a request for the information from the Lab’s asset servers, and it becomes “hot” data stored by the CDN. Thus, what is actually stored on the CDN servers is defined entirely by users as they travel around the grid.
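To make that flow a little more concrete, here is a small, purely conceptual Python sketch – my own illustration, not the Lab's or Highwinds' actual code – of an edge cache that serves “hot” data where it can and falls back to the origin asset store on a miss; the class, function and asset names are all invented for the example.

```python
class EdgeCache:
    """A toy model of a CDN edge node: serve cached "hot" data where possible,
    otherwise pull the asset from the origin (the asset servers) and keep a
    copy for the next viewer that asks for it."""

    def __init__(self, origin_fetch):
        self._hot = {}                    # assets someone has already requested
        self._origin_fetch = origin_fetch

    def get(self, asset_id):
        if asset_id in self._hot:             # cache hit: served from the edge,
            return self._hot[asset_id]        # no round trip to the origin
        data = self._origin_fetch(asset_id)   # cache miss: fetch from origin
        self._hot[asset_id] = data            # ...and it becomes "hot" data
        return data

# Usage: what users actually look at defines what ends up cached at the edge.
origin = lambda asset_id: f"<bytes of {asset_id} from the asset servers>"
edge = EdgeCache(origin)
edge.get("texture-1234")   # miss: pulled from the origin, now cached
edge.get("texture-1234")   # hit: delivered directly by the edge node
```

The point of the usage lines at the end is the one Monty makes: the first request for an asset pulls it from the Lab's servers and leaves a copy at the edge, and subsequent requests from that part of the world are served locally.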

The CDN is used to deliver “hot” texture and mesh data – the data relating to in-world objects – to the viewer on request

The HTTP work itself is entirely separate from the CDN work (the latter was introduced by the Lab's systems engineering group, while Monty, as noted in my HTTP updates, has been working on HTTP for almost two-and-a-half years now). However, they are complementary; the HTTP work was initially aimed both at making communications between the viewer and the simulator hosts a lot more reliable, and at pivoting some of the data delivery between simulator and viewer away from the more rate-limited UDP protocol.
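By way of illustration only – and this is a simplification of just one aspect of what HTTP delivery offers, using a placeholder URL rather than any real Second Life endpoint – persistent HTTP connections let many asset requests share one connection, rather than each paying its own setup cost or queueing behind per-packet rate limits:

```python
import requests

# Placeholder base URL; not a real Second Life or CDN endpoint.
BASE_URL = "https://asset-delivery.example.invalid"

def fetch_assets(asset_ids):
    """Fetch many assets over a single keep-alive HTTP connection pool."""
    results = {}
    with requests.Session() as session:   # persistent connections are reused
        for asset_id in asset_ids:
            response = session.get(f"{BASE_URL}/asset/{asset_id}", timeout=60)
            results[asset_id] = response.content
    return results
```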

As Monty admits in the second half of the interview, there have been some teething problems, particularly when using the CDN alongside his own HTTP updates in the viewer. This is being worked on, and some recent updates to the viewer code have just made it into a release candidate viewer. In discussing these, Monty is confident they will yield positive benefits, noting that in tests with users in the UK, the results were so good, “were I to take those users and put them in our data centre in Phoenix and let them plug into the rack where their simulator host was running, the number would not be better.”

So fingers crossed on this as the code sees wider use!

In terms of future improvements / updates, as Monty notes, the CDN is a major milestone, something many in the Lab have wanted to implement for a long while, so the aim for the moment is making sure that everyone is getting the fullest possible benefit from it. In the future, as Oz Linden has indicated in various User Group meetings, it is likely that further asset-related data will be moved across to the CDN where it makes sense for the Lab to do this.

This is a great conversation, and if use of the CDN has been confusing you at all, I thoroughly recommend it; Monty does a superb job of explaining things in clear, non-technical terms.

The Drax Files Radio Hour: land of the Lindens

Time constraints have meant I've not had a chance to mull over the last few Radio Hour podcasts, which is a shame, as there have been some gems. If you've not already done so, do try to catch show #41 for a brilliant interview with Justin Esparza, the man behind one of the great legends of SL – Salazar Jack. Then there's show #44 with Jaimy Hancroft, one of the great talents behind Dwarfins and the creator of the magnificent Hope's Horizon at the 2014 Fantasy Faire.

However, in the latest podcast, show #46, Drax ventures out on his own to visit the Lindens on their home turf, dropping in on the Battery Street offices for an informative visit, offering a lot to listen to and absorb.

The Lab’s Battery Street staff (image: Ebbe Altberg, via Twitter)

The first big interview, kicking off at the 18:08 mark of the show, is with Monty Linden, who provides a clear-cut explanation of the Content Delivery Network (CDN) and also talks about his HTTP project work. Such is the level of information in this conversation that, rather than condensing it into a couple of paragraphs here, I've covered it in a separate article, as it really does help frame both the CDN work and the HTTP work in non-technical terms.

That said, Drax also leads Monty into a discussion about net neutrality, starting at the 24:50 mark in the interview (and continuing through until the 30:13 mark), which is also worth listening to in detail (and which I've deliberately excluded from the article on Monty's CDN / HTTP discussion).

Down in the basement – looking down on the Lab’s engineering team at Battery Street (image via The Drax Files Radio Hour)

Elsewhere in the show, Drax gets to try out the DK2 with Second Life (36:27), with Ebbe revealing that a popular destination when demonstrating the Oculus and SL to journalists is Mont Saint Michel, which, for those who have not visited it, is a glorious Second Life reproduction of the “real thing”. Ebbe also makes mention of one of the problems that precludes SL from being an “ideal” companion for the Oculus – the render engine isn't able to consistently manage the 90 frames-per-second already utilised by the Oculus Crescent Bay prototype in order to eliminate issues of image judder when the wearer turns their head.
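For a sense of scale – this is my own back-of-the-envelope arithmetic, not a figure from the show – a 90 frames-per-second target leaves the renderer roughly 11 milliseconds to produce each and every frame, versus around 17 milliseconds at a more familiar 60 frames-per-second:

```python
# Frame-time budgets implied by the target frame rates
for fps in (60, 90):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")
# 60 fps -> 16.7 ms per frame
# 90 fps -> 11.1 ms per frame
```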

In discussing the Oculus Rift, Ebbe indicates that the Lab is working to make the abstraction layer for input devices as open as possible on their next generation platform, so that new devices can be added as easily as possible. He also reveals the new platform already works with game pad devices and the Leap Motion.

The discussion of the Oculus and Leap Motion is particularly interesting, as it opens the door on the myriad challenges encountered in user interface design. For example, with gesture devices, not only do you need to define the gestures required to move an avatar and interact with in-world objects, etc., you also need to consider what's required in order for the user to interact with the UI itself – to press buttons, make menu selections, and so on. These complexities of user interface design get even deeper when you consider that not only do they have to work across multiple client platforms, they have to work across multiple combinations of client platform, input devices and other hardware (screens, headsets, etc.).
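As a purely hypothetical sketch of what such an input abstraction layer might look like – none of these class or method names come from the Lab; they simply illustrate the idea – each device adapter translates its own raw input into a common set of events, so supporting a new device means writing one adapter rather than changing the platform or its UI:

```python
from abc import ABC, abstractmethod

class InputDevice(ABC):
    """Hypothetical common interface: every device reports the same
    normalised events (move, look, select), however it captures them."""

    @abstractmethod
    def poll(self) -> dict:
        ...

class GamePad(InputDevice):
    def poll(self) -> dict:
        return {"move": (0.0, 1.0), "look": (0.0, 0.0), "select": False}

class LeapMotionHands(InputDevice):
    def poll(self) -> dict:
        # A real adapter would map hand-tracking gestures onto the same events.
        return {"move": (0.0, 0.0), "look": (0.1, -0.2), "select": True}

def update_frame(devices):
    for device in devices:
        events = device.poll()
        # The platform (and its UI) only ever sees normalised events, so it
        # neither knows nor cares which physical device produced them.
        print(type(device).__name__, events)

update_frame([GamePad(), LeapMotionHands()])
```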

Mont Saint Michel (Inara Pey, June 2013, on Flickr) – a location the Lab uses to demonstrate the Oculus Rift and Second Life to journalists

Mention here is also made of High Fidelity. While the two are entirely separate companies, there is an intimation from Ebbe that High Fidelity may be one of the “technology partners” the Lab is talking to with regards to facial recognition capabilities in the next gen platform. Given that the Lab did provide some seed money towards High Fidelity’s first round of funding, this would make some sense.

As Drax tours the Lab's office with Ebbe (35:13), some interesting snippets of what is going on are provided – such as the work that's already under way on the “next generation Marketplace”. This is further touched upon in a conversation (43:59) with Brooke Linden from the SL Commerce Team. Not only does she discuss aspects of the Marketplace such as trying to address performance issues, improve search and so on, she also confirms that the Commerce Team is working closely with those working on the next generation platform to ensure that lessons learned in operating the SL Marketplace are carried forward in support of that project.

A potentially interesting snippet about the SL Marketplace from the conversation is that it handles a larger volume of sales than most on-line e-commerce sites. As Brooke points out, given that it deals in micro-transactions, it is somewhat easier for the Marketplace to generate volume sales; even so, that volume still presents a challenge when it comes to managing things.

Left-to-right: Shaman and Kona Linden from the QA  team and (looking like he’s fresh from the set of Star Trek sans insignia!) Caleb Linden. Shaman (one of the friendliest and welcoming members of the Linden team I’ve met in-world) and Kona discuss with Drax the idea of making Lab’s internal merchandise, such as the Rubik’s cube Shaman is holding, available to users, as well as matters of community (both within the Lab and in SL). Caleb co-leads the Server Beta User Group meeting on Thursdays (image via The Drax Files Radio Hour)

One interview that didn’t make it to the podcast features Jeff “Bagman Linden” Peterson, the Lab’s VP of engineering, who is heading-up the next generation platform work (Don “Danger Linden” Labs having the lead on Second Life). Apparently, a little too much was revealed about the new platform considering the growing commercial interest in virtual world spaces, so the Lab has requested that  Unfortunately, dues to the fact the Lab is keeping a tight lid on the new platform for the time being, the interview has been shelved for (hopefully) a later date.

All told, a really interesting podcast, one that shouldn’t be missed.