Aug 30, 2012

Digital Evolution

An initial Metaverse Blueprint. Beyond #SecondLife

 

Today’s post adds something of interest to the current discussion concerning the Metaverse. Instead of debating whether what we already have constitutes a Metaverse, or whether today’s existing systems are a possible path to one, I’d like to offer a very different approach.

 


 


 

 


 

What I’ll be writing today circumvents that particular set of topics and gets right to the root of the matter by addressing what a Metaverse will actually accomplish. Call it a rough draft or a blueprint: these are the things we should be focusing on today if we ever want to realize that dream which is the Metaverse.

 

Bottom Up

 

Straight out of the gate, I will say that our approach thus far has been lackluster. This isn’t anyone’s fault in particular; we simply have our priorities askew. I applaud =ICAURUS= for taking some initial steps to go about this, but immediately I’m throwing the flag out onto the playing field for a penalty.

 

Addressing whether or not the next system has the appropriate rendering engine is a top-down approach. We’re talking about the what instead of the how. This is a dangerous misstep, and it is usually the first mistake these endeavors make when attempting to build the Metaverse. For instance, look at Linden Lab: Second Life was built under the early assumption that it would never become as popular as it did, and that philosophy dictated some up-front decisions which came back to haunt them later in the lifecycle.

 

Such thinking in the beginning usually leads down the road to situations where we’re talking about patching an outdated system. So let’s think about this from the bottom up instead of trying to build the skyscraper starting from the 100th floor and working our way to the basement.

 

Root Prim

 

To begin, let’s ask a very basic question -

 

What can the Metaverse actually do, or more importantly, what does it actually accomplish?

 

The simple answer to this is that the Metaverse is a Spatial Representation of Data, wherein the ecosystem supports many modes of interaction from local nodes (single user) to multi-node (many users) within a contiguous space.

 

The next question becomes -

 

What data can this system represent?

 

In order to answer this we need to start with an existing context. We could very well just invent new types of data, but it’s far simpler to begin with existing forms of data and go from there. So as a basis, let’s say that the most important foundation for a forward-looking Metaverse client is that it begins as a functional Web Browser.

 

This becomes our metaphorical ground level in the skyscraper. From here we build upon that foundation to the top of the skybox, but for now we need to employ the KISS method (Keep It Simple, Stupid).

 

The Metaverse should be able to handle the basic standards that an existing Web Browser can handle today, and do so natively. Remember, we’re building an HTML5-compliant Web Browser first (and a damned good one). I cannot stress that last point enough, because the built-in web browser for Second Life is horrible. In this proper context, the web browser becomes half of the integrated experience, and so you can’t afford to screw it up or treat it like an afterthought.

 

Ok, so why are we doing that instead of just building the 3D Metaverse up front?

 

It’s safe to make the foundation a Web Browser simply because it is natively a 2D context for the same data that we would like to represent spatially.

 

Now we start asking the important questions.

 

Let’s say we now have our HTML5-compliant web browser. What does it support overall? It can handle FTP, HTTP, HTTPS, image rendering, audio playback, video playback, animated images (GIF, APNG, MNG) and of course these wonderful things called add-ons and plugins.

 

As of this moment, an HTML5-compliant web browser actually handles the obvious stuff better than the best of our Virtual World clients. Now that we can acknowledge that, I think we’re in a better position to remedy this issue up front.

 

Modes of Operation

 

I’ve gone over this in a rudimentary fashion within the confines of the Second Life JIRA, but there is far more context than I let on for why I submitted it as a feature request. VWR-22977, “Built-In Web Browser Uses New Canvas Rendering Layer [Not New Floater Window]”, is a testament to just how far in advance I was thinking before submitting things. To this day it hasn’t been reviewed, nor did I ever expect that it would be. All I really wanted was to make certain it was on public display.

 

So here’s the bigger picture -

 

When you’re building a new Metaverse client from scratch, your first view looks something like this:

 

 

Viewer 2 New Layout (Web View) VWR-22977

 

 

It doesn’t look exactly like that, but it’s the best reference image I’m going to provide at this time. The most important aspect is in how our modes of operation behave, which can be simplified to what changes I’ve made to the top navigation bar -

 

 

Viewer 2 New Layout (Web View) VWR-22977

 

 

Viewer 2 New Layout (3D View) VWR-22977

 

 

The most noticeable thing about the change is two-fold. First, and foremost, the client acts like a native web browser: the web rendering canvas uses the same full-window space otherwise reserved for the 3D rendering canvas. In effect, we’re simply using two rendering canvases, only one of which is in view at any given moment while the other is paused.

 

On the far left is a new type of button which does not exist on any web browser today; the other buttons (back, forward, stop, home, plus a reload button that doubles as an area rebake) are common to both a virtual environment viewer and a web browser.

 

So here’s the deal… this simple foundational change in the way things are organized from the get-go is enough to fundamentally change how we perceive the Metaverse and (interestingly) the entire existing Web.

 

Secondly, what this change implies is far greater than what was explicitly mentioned in the JIRA that I filed, but anyone savvy enough to give this much thought begins to see the implications it would have.

 

For instance, the default file for a web server is index.htm or index.html, and that’s how we know we’re at the “home page” location on a system. With a multi-mode Metaverse client, we gain an analogous convention from this operational change: index.vrtp, or whatever extension you’d like to call it.

 

It’s your answer for universal Hypergrid teleportation. When a web address becomes capable of serving as a Metaverse teleport (representing a spatial location within the Metaverse), you’ve just opened up a Pandora’s box of opportunity. Instead of remembering a long SLURL or Hypergrid teleport string, we can embed that as metadata in a type of XML format at the root of a website, alongside the index.htm.
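As a sketch of what that root-level metadata could look like, here is a hypothetical index.vrtp. To be clear, the .vrtp extension, every element name, and the vrtp:// address scheme are invented for illustration; no such standard exists:

```xml
<!-- index.vrtp: a hypothetical descriptor served next to index.htm -->
<vrtp version="0.1">
  <!-- the spatial address the teleport resolves to -->
  <location>vrtp://grid.example.com/regions/lobby/128/128/22</location>
  <owner>Example Owner</owner>
  <description>Front lobby for example.com</description>
  <!-- whether the space is single-node or multi-user -->
  <mode>multi-user</mode>
</vrtp>
```

A standard web browser would simply ignore this file, while a Metaverse client could fetch it alongside the page and surface the teleport.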

 

That Metaverse index file is seen by the client as a location with further XML data attached (owner, description, and so on), and we can convey that in the client via a notification. When you visit that website with a standard web browser, you just get the website. But when the Metaverse client visits that same website (say, Google.com), it sees index.htm/html for the web browser portion of the client (consider this your dynamic brochure for a location), and it also looks for index.vrtp (Virtual Reality Teleport Protocol).

 

In this instance, let’s say you’re browsing the Web with your new Metaverse client. You type in MetaverseTribune.com and your HTML5-compliant web browser loads it up just like you would expect. Now, let’s say they also had an index.vrtp in the root directory alongside the index.htm.

 

You’d get an unobtrusive notification saying that the page you are looking at has a location, and you may click to go there in the Metaverse. Or you may turn that notification off entirely and simply use MetaverseTribune.com in the 3D Address Bar instead of an SLURL/Hypergrid Teleport. If they have the index.vrtp on that server, you’ll get teleported to the location, and (as a really cool side effect) when you arrive, the location can have a website automatically loaded in the web browser portion as part of the location.
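To make that flow concrete, here is a minimal sketch of how a client might interpret a fetched index.vrtp and build the unobtrusive notification. Everything here is hypothetical: the element names, the vrtp:// scheme, and the notification wording are assumptions for illustration, since no VRTP standard exists.

```python
import xml.etree.ElementTree as ET

def parse_vrtp(descriptor_xml):
    """Extract teleport metadata from a (hypothetical) VRTP descriptor."""
    root = ET.fromstring(descriptor_xml)
    return {
        "location": root.findtext("location"),
        "owner": root.findtext("owner"),
        "description": root.findtext("description"),
    }

def notification_for(site, meta):
    """Build the unobtrusive 'this page has a location' prompt."""
    return (f"{site} has a Metaverse location: "
            f"{meta['description']} (owner: {meta['owner']})")

# A sample descriptor, shaped the way this post proposes.
sample = """<vrtp version="0.1">
  <location>vrtp://grid.example.com/regions/newsroom/128/128/22</location>
  <owner>MetaverseTribune</owner>
  <description>Newsroom lobby</description>
</vrtp>"""

meta = parse_vrtp(sample)
print(notification_for("MetaverseTribune.com", meta))
```

If the fetch for index.vrtp simply 404s, the client behaves as a plain web browser, which is exactly the graceful degradation described above.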

 

How about just an icon in the address bar denoting that the website has an available location in 3D, where clicking that icon gets you the teleport? The same could go for visiting a location that has a website attached to it: an icon appears in the address bar that, when clicked, opens the homepage as set by the location owner.

 

 

Legacy of Advancement

 

This is just one reason to start with our web browser context first and translate it to the virtual world, but how about other contexts? How does our Metaverse client handle an FTP connection?

 

This is why we’re basing things on building a Web Browser first: from there we start translating how the web browser handles existing standards and protocols into our spatial environment, all without drastically changing the initial Web mode of operation. This way, if a new user can use a web browser, they will be right at home with the Metaverse view.

 

Back to the FTP connection… how does our Metaverse client handle that context?

 

This is why we’re framing this as multi-node and local-node operation. In an FTP scenario, the client would dynamically generate the environment to represent the files (as the objects within the environment) and the folders as rooms you can walk around in. This also works for parsing local directories (just in case you felt like walking around your hard drive).
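As a rough sketch of that files-to-objects, folders-to-rooms mapping (the room/object grouping here is invented for illustration; a real client would hand this structure to its 3D scene graph):

```python
def build_space(paths):
    """Group relative file paths into 'rooms': each folder becomes a room,
    each file an object placed inside it. The input works the same whether
    the paths came from an FTP listing or a walk of the local disk."""
    space = {}
    for p in paths:
        folder, _, name = p.rpartition("/")
        # files at the top level land in the root room "/"
        space.setdefault(folder or "/", []).append(name)
    return space

# e.g. an FTP directory listing flattened to relative paths
demo = build_space([
    "docs/readme.txt",
    "docs/spec.pdf",
    "music/track01.ogg",
    "index.htm",
])
# demo now maps each room to the objects placed in it
```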

 

Now we’re asking the obvious “Why the hell would we want to do this?”

 

Because I should also be able to attach hyperlinks to objects in the 3D space, and if that hyperlink points to an FTP location, then it is treated like a portal into a local-node space. I’m not breaking the metaphor of interaction, and that is the most important part of immersion.

 

In an FTP or local-node context (your hard drive), the owner can set a similar VRTP descriptor XML in the root which defines whether it is a singular node (not multi-user: I and many others could each see the space, but not each other) or a multi-user space. In the context of the web, that’s a hell of a lot of people suddenly traversing dynamic spaces online (potentially 6 billion virtual spaces), and right now we shouldn’t have to worry about everybody running a special server to handle it. See the basic references at the end for the reason why.

 

This is why the architectural foundations of this are far more important than what rendering engine we’re using. While the rendering engine is important, it pales in comparison to the “how this thing operates” versus the “what it looks like” eye-candy portion.

 

In the structural portion, we’re looking at a hybrid decentralized system of operation. In the dynamic modes, we’re connecting to each other in a peer to peer fashion. So you wouldn’t necessarily have to be running a special server to have an environment. A standard Web Server right now becomes a potential Metaverse Space just by adding those XML descriptors, or using in-world Hyperlinks to those dynamic spaces.

 

What about using media from existing servers online within the virtual world? Instead of uploading a file to an asset server, maybe you already have the media on a server of your own? Why not have the ability to state a web address as the file location?

 

Now we’re talking native context for existing MIME types, which already have the storage part down pat. Speaking of which, wouldn’t it be great to natively open PDF files, images, audio, videos, and more in the Metaverse? Again, the web browser does this without even thinking twice… but the current generation of virtual world viewers simply… well, they don’t address this very well, if at all.
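A sketch of what “native context for existing MIME types” could mean in practice: route a URL referenced in-world to an appropriate handler by its MIME type. The handler names here are hypothetical placeholders, not an existing viewer API.

```python
import mimetypes

# Hypothetical in-world handlers keyed by exact MIME type or top-level family.
HANDLERS = {
    "application/pdf": "open_in_document_viewer",
    "image": "apply_as_texture",
    "audio": "stream_to_parcel_audio",
    "video": "stream_to_media_surface",
}

def pick_handler(url):
    """Choose an in-world renderer for a URL, falling back to the
    integrated web view when the type is unknown."""
    mime, _ = mimetypes.guess_type(url)
    if mime is None:
        return "open_in_web_view"
    # exact match first, then the family (e.g. "image" for "image/png")
    return HANDLERS.get(mime) or HANDLERS.get(mime.split("/")[0],
                                              "open_in_web_view")
```

The existing web stack already solves storage and delivery for these types; the client’s only new job is the last-mile presentation choice.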

 

By now you should see why starting with an HTML5 web browser as our foundation makes sense.

 

How does the Metaverse translate into 3D what a Web Browser handles effortlessly in 2D?

 

Using the Web Browser as our Metaverse Checklist.

 

 

After we are comfortable handling the translation of existing data into our Metaverse context, we can move on to Metaverse specific contexts which need to be addressed.

 

For instance, a universal passport/avatar. Translation of Metaverse currencies via Exchange Rates. A Metaverse Location Crawler to work as a Search Engine. A marketplace system that is built into the client – which actually becomes damned easy when it has a native web browser context. Decentralized Asset Storage systems that are secure. Authentication. Building an SDK and licensing it.

 

The last item on that list is the least obvious. It’s not mandatory, but it’s probably a good way to monetize the work required to build such a system. Setting up the spaces requires no license per se, but if you want to build a new product or plugin with or for it, then there is a license. Think of it like a Metaverse App Store. Or just monetize a percentage of the apps themselves and make the licensing and SDK free… whatever floats the steampunk airship…

 

There is a lot to solve here, and the rendering engine is probably the least of our worries. As a matter of fact, if this were properly built, then we could substitute any modern graphics engine on top of our foundation and it would work.

 

Plugins and Add-Ons

 

Just like a modern web browser, the Open Metaverse client should support an extensible plugin architecture for add-ons and outright browser plugins. Maybe the web browser portion just handles Chrome extensions natively, while the Metaverse mode has its own add-on architecture and maybe an SDK for full-blown plugins to extend the viewer capability much further.

 

This concept of building the base system and then allowing a plugin architecture and add-ons is not new. Web browsers already do this de facto, and even if we were to look back at cyberpunk culture (Shadowrun), we had decks with slots to load custom “apps” which extended or improved the experience.

 

I’m going with the fictional Metaverse concept here, and our current generation of actual web browsers, as proof this is the right approach to our future Metaverse system. Modular and extensible.

 

 

Open Metaverse Foundation

 

It should be the equivalent of the Mozilla Foundation, but for the Metaverse. Time to shake things up and become cool again.

 

 

Basic References

 

Let’s say you’re an aspiring coder (or team of coders) looking to tackle this next step of the Metaverse. Below are some exceedingly helpful pointers for reference; these links should give you a head start on the foundation aspects:

 

P2P Architecture – Solipsis Decentralized Metaverse. It offers area-of-interest networking and a peer-to-peer method for handling larger numbers of people. When the system starts scaling upward and becoming popular, you’ll be glad this is in your back pocket to load-balance against. This is likely the diamond in the rough that would allow the entire pre-existing Internet to be turned into dynamic multi-user spaces, i.e. an instant Metaverse. The website is locked down tight and may never return; however, I do happen to have a copy of the source code and research paper archived if anyone is interested.

 

Asset Servers – I could suggest something like the Owner-Free File System. It’s a multi-use block storage paradigm (a brightnet) that would allow the budding Metaverse creator to balance existing caches of users against having to centrally store it all. It saves a lot of redundant bandwidth and processing.

 

Feel free to make this a community reference for ideas on where the pieces of the overall puzzle lie for aspiring coders. Add your own references in the comments below.

 

This is only a very rough stream-of-consciousness post, and shouldn’t be taken as an “entire” proposal. The only thing I wished to convey was the proper starting context and metaphor of interaction so we (as a community) could start off on the right foot.

 


24 comments:

  1. Will -
    I agree that we need an HTML 5 Web-based client for OpenSim.

    However, I disagree that the Metaverse is, at its most basic, a 3D presentation of data.

    In my opinion, the Internet is a way to communicate data. The Metaverse is a way to communicate experiences.

    There are already tons of ways to embed 3D images and graphics in websites. This does not make a Metaverse, and it's why VRML didn't go far.

    However, if you look at the Metaverse as a way of, say, communicating the experience of being in a classroom, or attending a party, or being a vampire and biting someone on the neck ... now you're facing a totally different set of technical requirements and basic minimums.

    Second Life (and OpenSim) allow users to create experiences and then communicate those experiences to others.

    Navigating a website or posting to Twitter then becomes just another experience that you can embed in a virtual world (which you now can, with web-on-a-prim).

    ReplyDelete
    Replies
    1. 1. Proof that you didn't actually read what was written here. I never said we needed an HTML5 Web based client. Go back and actually re-read it.

      2. Experience is objective at best. The only difference between the experiences is that one is predominantly 2 dimensional while the Metaverse is three dimensional. Other than that, they both share the same heritage and data.

      3. Web on a prim is a piss poor example of web implementation, just like the built-in web browser itself is a similarly half-baked experience compared to the real thing.


      Delete
    2. In that case, you've totally lost me.

      I could understand your post if it was in the context of an HTML 5 client...

      Meanwhile, on the data vs. experience front -- the two are extremely different, and require different form factors.

      Communicating information is primarily visual, occasionally multi-media, sometimes 3D charts or other visualizations. It's a one-to-many relationship. It's asynchronous for the most part, except for one-to-one, or group-based communication channels like Skype or chat rooms. It invites multitasking.

      Communicating experiences is primarily synchronous, and is extremely personal, requires immersion and attention and limits multitasking.

      A client designed to primarily communicate experience -- one focused on immersion, identification with the avatar, creation and development of relationships -- will not be as good at communicating straight information as a client designed just for that.

      Delete
    3. Experience is what the human factor brings into the environment. This is a discussion about two types of spaces - Passive and Asynchronous. Passive meaning that the data itself is parsed into a spatial context, while asynchronous is when that data is parsed but also offering asynchronous interaction directly modifiable by the inhabitants.

      What I am advocating is getting to the root and identifying that the data is the same in either case. The very models that make up the "experience" are just a data type represented spatially. Just like everything in your inventory is just a data type. You can see them in a 2D view as file icons, etc or you can interpret them as part of the spatial Metaverse.

      What is being said here is very plain, and it is the opposite of building a "Metaverse as WebGL Client embedded in a website"; it focuses instead on interpreting existing data types in a native client, as I pointed out with the differentiation of the images provided clearly within the post.

      I did not say, either, that the passive mode was the only manner to experience it, but only that it is the lowest hanging fruit to ubiquity before adding the asynchronous layer. Once you can interpret the passive spaces of the pre-existing web, you now have billions of spaces overnight without anything special other than the particular client which interprets it.

      Delete
  2. I still think if you try to combine a standard 2d browser and an immersive 3d browser into one client, you wind up with a compromise that doesn't serve either set of users well.

    Like Microsoft's new Windows 8 operating system. Too bulky and slow for a tablet, but with a UI designed for a touchscreen interface, offering the worst of both worlds.

    I believe you have to start with the user experience when designing something. Saying that "everything is data" is technically true -- you could argue that the entire universe is just data. But the user experience is very subjective, very personal -- and very, very difficult to get right. We'll probably have many different approaches before some visionary company comes out with something that clicks -- and then everyone will rush out and copy it.

    ReplyDelete
    Replies
    1. There is a distinct difference between a tablet interface and a standard computing interface. Both are two very different methods of working with context in a hardware sense. One uses a mouse and keyboard while the other assumes no mouse or keyboard.

      With what I've outlined, we make no such distinction, and already we utilize the same method of interaction by default and add other modes of interaction afterward.

      The biggest hindrance to the evolution of the Metaverse thus far is the inability to stop thinking in an "either/or" mentality.

      Just as quickly as you dismiss a hybrid browser, you're quick to champion embedding a Metaverse plugin into a web page, which you know full well is myopic in ability at best. What I'm championing is the ability to do both contexts without sacrificing anything in the process, and doing so while merging interaction via a commonality. This makes a hell of a lot more sense.

      See also Damon Mile's comment below.

      Delete
  3. Bill, you know me. I have to add my two-cents to this as the very context of this conversation is what I'm currently working on with OpenSim and the Imprudence Client.

    Most notable facet to this entire conversation and your post is truly the "What" of 2D & 3D interaction. "What" to do with this information that regardless of which state we connect to it with (2D or 3D), "What" is it for and "What" do we do with it? We are presented with textual/binary data which in a 2D browser, is rendered into a Webpage. Hyperlinks, PDF's, Plugins & Add-ons, etc, are just secondary aspects to "What" to do with these streams of textual data.

    The same can then be said of a 3D browser (Metaverse client). It's just rendering that stream of data (still textual in delivery, i.e. scripting language or XML) into three-dimensional objects represented in a spatial environment, instead of just object-oriented code that executes linear functions and tasks.

    At the core of this is ...basically just streaming bits of textual data that simply represents "something". Having a multi-render-capable browser that would asynchronously handle the latest in web technologies yet within the same frame/context window render that same data in 3D would I think revolutionize the way we think and use the "Browser". It's the "singularity" at work converging this information technology into something we can get a handle on.

    It's also the chicken-and-egg syndrome. At the beginning of the IT revolution, it's not so much that computer graphics couldn't render 3D fast enough or well, but that it just wasn't practical. Most data represented on the web started as BBS boards and public-domain directories of files. Then that evolved into rendered pages to better display really that same information, just in a more visual way. Technology got better and the ability to visualize the data turned into the rich media experiences we have now. But as Raymond Kurzweil puts it, it's just a matter of time before the need for technology levels off with how we consume. It is forcing a singularity: a causality of all these different technologies coming together to simplify their use and consumption in our lives.

    As pervasive as 3D has become on the internet, and as exponentially as social networking, communication, and presentation have advanced, it only makes sense to converge 2D and 3D rendering into one multifaceted "Browser" technology.

    I know you remember "CyberNet" Bill. That was a playful example of Convergence with a dash of Shadowrun and a lot of promise. I'm really wanting to bring that back but in a more practical yet grandiose way.

    ReplyDelete
  4. By the way, =IcaruS= isn't building a metaverse, so red flag to that mistake. You need to understand what a VW is and what the metaverse is. We will clarify for you in a future post

    ReplyDelete
    Replies
    1. I know the difference. I was apparently giving you far more credit than you deserved. Thanks for pointing that out :)

      Delete
  5. Your quote "Addressing whether or not the next system has the appropriate rendering engine is a top down approach. We’re talking about the what instead of the how. This is a dangerous misstep and is usually the first mistake these endeavors make when attempting to build the Metaverse"

    ReplyDelete
    Replies
    1. You were writing from the premise of Second Life and OpenSim. So you'll have to forgive me for assuming that you weren't trying to build just another closed walled garden, considering that would have been counter-productive to actually moving forward in a bigger picture.

      Regardless, it's still a truthful assertion. Building a closed walled garden that focuses on the engine instead of the functionality of interoperability or architecture is not a step to the Metaverse.

      In light of that, your prior comment of "I'm not building a Metaverse" is met with a "No shit, Sherlock." Because any way you slice it, you weren't going about it like you were or had any intention to. Whether it was on purpose or intentional was up in the air until you opened your mouth.

      Delete
  6. by the way i think you mean UN-intentional you tend to contradict yourself every time you open your mouth, I suppose that comes from not knowing what you are saying yourself

    ReplyDelete
    Replies
    1. You started your conversation based on the premise of Second Life and Open Sim. Second Life deciding to remain a walled garden and Open Sim being open and supporting the premise that it could expand into a Metaverse and not just a closed system. Therefore you intentionally began your conversation fully knowing you were talking about the possibility of Metaverse and not a stand alone virtual world through the very premise of context you set from the start. At best, you began on a line of bullshit in order to misdirect the reader into the intentions of your exploration into virtual worlds as something more in an overall conversation about the Metaverse.

      In order to attempt to save your own ass, you instead decide that you need to define what a virtual world is versus a Metaverse to attempt to sound intelligent.

      So let me enlighten you, because I actually wrote the standard definition of a Metaverse currently in use and cited through INRIA, IEEE Standard definition, Research, and academia.

      http://www.metaversestandards.org/index.php?title=Metaverse

      Multiple MetaGalaxy & Metaworld systems linked into a perceived virtual universe, although not existing purely by a central server or authority. Originally coined by Neal Stephenson in the book Snow Crash as a single Metaworld known as "The Street", the Metaverse today describes a collection of Metaworlds that are seamless and interconnected but largely decentralized which comprise the perceived virtual universe (metadata universe, metaverse) but follows closer to the overall persistence of a Metaverse from the roots of Cyberpunk genre in which systems much like the Metaverse existed but within the scope of many worlds and uses, foreshadowing a decentralized system of operation and scope. Theoretical Example: Interoperability between such systems as ActiveWorlds, SecondLife, BlueMars, and others by which a standardized protocol and set of abilities exist within a common interface which can traverse among virtual world spaces seamlessly regardless of controlling entity.

      To wit, the difference between Virtual World and Metaverse is context and functionality for interoperability. This is the underlying point you so ignorantly gloss over in your quest to sound smugly intelligent on a topic you seem to know little about.

      You started a conversation with a solid premise of Metaverse as nothing more than a misdirection by association. I thank you for correcting me about your inability to actually pursue that road and instead just be yet another Skyrim modder.

      Best of luck.

      Delete
    2. In retrospect - asserting that the guy responsible for the modern definition of the Metaverse doesn't know the difference between virtual world and Metaverse is probably the single most ignorant assertion you could have made.

      I merely followed the train of logic you set out in framing your post with Second Life and Open Sim as to mean that you may intend to follow suit and apply to be more than just another virtual world but a groundwork for a collective which could move to the Metaverse, based on newer game engine technologies and ability.

      In "moving beyond" Second Life, what you should have said more accurately was "I'm making a lateral move at best" and being just another virtual world in a sea of nameless virtual worlds already in progress.

      It is my fault alone for giving you more credit than you deserved in making a logical assumption you had a bigger plan in mind than just "let's screw with a game engine" to make another stand-alone virtual world.

      It's been done before. We called it BlueMars.

      So yes, forgive me for thinking you were actually doing something that may serve to advance the premise. In reality, it's just more of the same.

      Delete
  7. What you did was copy a mangled version to a wiki page ranked 10,139,601 (albeit it’s higher than your own site: 16,789,077 ) as opposed to where you got it from http://en.wikipedia.org/wiki/Metaverse ranked 6th with 2,138,557 links sites. Which do you think more people see? It hardly constitutes any achievement, does it?
    Along with writing standards, perhaps you could exercise some STANDARDS and spend time fixing the 357 HTML errors and 67 CSS errors on your XHTML 1.0 Strict website? Empty rooms and echoes spring to mind, and you seem to think by association from reading The Book that you are somehow responsible for it.

    In any case it still isn't sinking in to you what we are trying to do, so it’s probably best that you put away your red flag and go back to The Snow Crash Book and leave the actual doing to others. Hmm ... you proposed the idea of this HTML5 compliant Web browser/Metaverse Client. Have you built one rather than just doing the equivalent of scrawling a moon rocket on a piece of paper in crayon?

    Anyway, rather than helping your pitiful traffic by replying to your vacuous nonsense, I will leave you to write some ‘reply’ and let you stare at it in smug delusion while I click away and never bother reading your pointless posts again. You proclaim to be an advocate of a metaverse, yet you are far more destructive than you could ever be creative.

    ReplyDelete
    Replies
    1. This comment has been removed by the author.

      Delete
    2. This comment has been removed by the author.

    3. Let me recap so far, just so we're clear.

      When I stated in this post up front:

      "I applaud =ICAURUS= for taking some initial steps to go about this, but immediately I’m throwing the flag out onto the playing field for a penalty."

      You apparently felt the need to correct me by stating you weren't pursuing a Metaverse. To that I did apologize openly for giving you more credit than was appropriate. That being said, "initial steps" begin with a virtual world, and the differentiation between that and a Metaverse lies in intention and context: whether such a path will be pursued, or whether the project is left as a stand-alone virtual environment. Taken in context with the overall discussion being the Metaverse, and steps to move beyond Second Life, there was no clear indication otherwise at the point when you left off talking about Avatars -

      Therein I did thank you for correcting me, as it only cemented the original point in this post: that what you were doing would not likely work if you were going for a Metaverse. What I suspected in this post was only proven by your comment. If you were going for a Metaverse, then your approach was likely not appropriate to achieve it - however, this becomes a non-issue the moment you state you are making only a virtual environment, because you have then clearly stated your long-run intention, and you are free to go about your business, since that approach is perfectly appropriate for a stand-alone virtual world.

    4. There simply was no further argument to be had from that point.

      Now, you could have left it at that and carried on. But no - and this is why I've merrily handed your ass back to you multiple times. Instead you chose to openly insult me by stating I did not know the difference between a virtual world and the Metaverse.

      To this I replied that you had chosen a person responsible for the modern definition of the Metaverse to assert did not know the meaning or differentiation of a Metaverse. I then backed that claim with hard fact by offering the current definition, as cited by the IEEE Virtual World Standard Workgroup (P1828), and giving you the direct link to the public wiki where it is written. I also offered the pre-existing context of the originating premise, stating that the definition was constructed for INRIA and the Solipsis Decentralized Metaverse, which gave a presentation to IEEE with a working prototype of a serverless p2p Metaverse system featuring novel, optimized algorithms for area-of-interest networking. That was in 2006/2007 - though my involvement with the Metaverse Roadmap itself predates that, with my name listed as one of the contributors - giving the same premise of the decentralization schema that would later be utilized by Solipsis.

    5. This in itself should have been enough to quell your argument, but you went ahead and did something so monumentally stupid as to defy logic and academic reason in a discussion which you so arrogantly began. You decided that the wiki link I gave you was really just a bastardization of the Wikipedia entry definition, lowered yourself further by attempting to shift the debate to credibility based on how many CSS and HTML errors the wiki had, and, to make matters worse, decided that *page rank* somehow creates authority on a subject, let alone Wikipedia in this discussion.

      Therein is your biggest and most ignorant mistake thus far. For in what amounts to an ill-informed attempt to assert that I had somehow plagiarized the Wikipedia entry, you blatantly and quite stupidly forgot to check the actual citations of the Wikipedia article in question which you were attempting to throw in my face.

      Had you taken a few minutes to check the citations, you would have quickly found that the Wikipedia article in question clearly and blatantly cites IEEE Virtual World Standard Workgroup P1828 in plain sight.

      So who is plagiarizing whom? Considering that my title at P1828 is Vice Chair, and that the definition as differentiated on that wiki (the public-access version of the process, as opposed to the IEEE Mentor site, which you are not a part of) was written by myself - taken directly from the definition I gave to Solipsis/INRIA for the IEEE presentation years ago - it can only be said that you have somehow, ridiculously, managed to accuse me of plagiarizing myself while citing a Wikipedia article that itself cites P1828.

      Congratulations.

      Are you finished making a mockery of yourself in public yet, or do you wish to continue this debacle? I'll gladly continue handing you your ass in public simply because you are making this far too easy.

      Allow me to address what I've *actually* done to further the Metaverse as a whole.

      Already you know about my involvement with the Solipsis Decentralized Metaverse. That was fruitful, with a working prototype demonstrated to IEEE in 2006/2007 and full research findings in hand on the methodologies used to make it happen, for others to build on. What you do not know - which at this point seems to be a hell of a lot - is that a dynamic system was prototyped as early as 1999, as I have described here in this post, in a rudimentary fashion to test feasibility. Among all the people in Second Life, I am quite possibly one of the few people on Earth who have contributed to, and *actually used*, the closest iteration of a Metaverse that has ever existed to date.

      From those early prototypes and mockups, I took what I had learned from my involvement and helped define a better premise for a forward-looking Metaverse, which in turn became the defining foundation for Solipsis: a coherent chain of logic which fully described past, present, and future iterations, the structures they would take, and for what reason.

      This, over many years of research and academic involvement on my behalf (and many others'), has led to more solutions than you will ever hope to find in your virtual world project. As of this writing, the novel storage and retrieval system for a universal cloud system is in proposal to JPL/NASA as a possible method of interplanetary file transfer, one which would allow exceedingly large files to be sent over a measly 125-bit-per-second connection. This is because of research that started with bettering the Metaverse as a whole.

    6. You see that comment up there above from Damon Miles? He's the guy who built the original prototype/mockup in 1999 of the dynamic Metaverse grid that I mentioned earlier in this comment. It was the closest thing to the Metaverse as described in Snow Crash that existed at the time, and as far as I am aware it still holds that title. Do you have any idea what he is doing right now?

      No, you do not - but I'll enlighten you. He is currently poring over the OpenSim viewer and server code and working out how to apply all of the methodologies I've described thus far, and he is privy to technologies that you could only dream about. In short, he's looking at the feasibility of applying the same methodologies he did in 1999 in order to make a Metaverse viewer capable of translating billions of existing websites into instantly accessible dynamic spaces - completely eliminating the barrier to entry for anyone who currently has a website. He's also looking at the area-of-interest networking system pioneered through Solipsis in order to add a decentralized, ad-hoc networking layer to OpenSim and whatever else may utilize that code later on. And he's looking at the polymorphic data system as a possible solution for a secure, universal asset system that is no longer centralized.
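      For readers unfamiliar with the area-of-interest concept mentioned above, here is a minimal sketch of the general idea - not Solipsis's actual algorithms, which are far more sophisticated - where each peer maintains direct links only to peers within its own interest radius, so no central server needs to relay every update to everyone. All names here (`Peer`, `update_neighbors`, `broadcast_update`) are hypothetical illustrations:

```python
import math
from dataclasses import dataclass, field

@dataclass
class Peer:
    """A participant in the overlay, located somewhere in a 2D world."""
    name: str
    x: float
    y: float
    radius: float                      # area-of-interest radius
    neighbors: set = field(default_factory=set)

def distance(a: Peer, b: Peer) -> float:
    return math.hypot(a.x - b.x, a.y - b.y)

def update_neighbors(peer: Peer, world: list) -> None:
    # Keep direct links only to peers inside our area of interest;
    # distant peers are simply not connected to us.
    peer.neighbors = {
        other.name for other in world
        if other is not peer and distance(peer, other) <= peer.radius
    }

def broadcast_update(sender: Peer, world: list) -> list:
    # Deliver a state update only to peers whose own area of
    # interest covers the sender - the rest never hear about it.
    return [
        other.name for other in world
        if other is not sender and distance(sender, other) <= other.radius
    ]
```

      The design point is that bandwidth per peer scales with local crowd density rather than total world population, which is what makes a serverless, decentralized grid plausible at all.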

      Tell me - what is your virtual world project going to accomplish? I can guarantee you that it is nowhere near what you just read above.

      And do you know who he's working with to make that happen? If you guessed me, you'd finally be right about something in this discussion. So having the audacity to assert that what you are doing with your game engine and stand-alone virtual world project is somehow far more important than my "do-nothing" attitude is yet another ignorant claim on your behalf.

    7. I've done far more for the advancement of the actual Metaverse than you have ever accomplished in totality in this industry.

      Now that you have been thoroughly decimated on all counts of your particular brand of fake intellectualism, it's time for me to add something on my own behalf.

      Any jackass with some knowledge of coding can pick up a game engine and build something. It takes a particularly low-brow jackass to assert that what they are building with that engine - a stand-alone virtual world which is not a Metaverse - is anywhere near as important as the bigger concept of the Metaverse, or that what they are doing somehow exceeds what I've contributed over the course of the past 15+ years. Whatever you are building with your game engine is about as important or monumental as the hundreds or thousands of other stand-alone virtual worlds in the industry.

      That level of importance ranges from nothing at all to barely a blip on the industry's radar. The case of BlueMars, which had millions of dollars and access to the Crytek engine, is proof enough that what this industry needs is not another virtual world, or some coder proclaiming their importance with their virtual world offering; what the industry needs is a collaborative to build the actual Metaverse.

      Unlike you, I've actually been hard at work trying to meet those ends: everything from the foundation, to the very standing definition, to the revolutionary technologies that will enable it in the long run. Because you see, I'm not looking at the short term - I'm looking at the long term, and how it will change the entire world for real.

    8. There are quite a lot of people who are in the same position and have spent many years of their lives in devotion to that future, mapping protocols, building prototypes, writing research and findings, and even coding existing Open Source systems in their spare time as a labor of love.

      How *dare you* step foot on this blog and denounce that. How *dare you* come here and insult me and equate all that I have done to nothing more than a point of trivial concern for you to flippantly and ignorantly dismiss as if you have absolutely *any* understanding of the subject at hand or what those who are actually involved with that future have contributed to those ends.

      But if you need even *more* proof of my expertise on this subject, then let's try something more recent. How about the recent ACM research paper wherein the same definition is in use, and which clearly bears my name - in one of the most respected journals in the world. Did you know that particular research paper has been requested for use by Washington State University as part of their Virtual Worlds Certificate course? Did you know it received glowing praise from the reviewers and a recommendation for Paper of the Year?

      Did you know one of the reviewers that gave us that glowing review was a colleague of Jaron Lanier, who commended us for our accuracy in defining the Metaverse and the technologies to innovate research further?

      Jaron would have been proud of that. It means a lot to me to be able to impress even the close colleagues of Jaron Lanier because they among anyone know what virtual worlds are and what they can become. It is an honor to have university courses taught using my own research on the Metaverse and its future as the primer. It is an honor and pleasure to help Masters and Doctorate students with their own research when they contact me to cite my works for their own academic research.

      And here you stand, an ignorant and belligerent little sh*t, in public humiliation and dishonor for asserting a knowledge of the industry, and of the work that has gone into it, that you do not possess on any account. Here you are, trumpeting your virtual world project like it's something important in the grand scheme of things while poorly attempting to take shots at what I have done and contributed on the grander scale.

      I don't rightly care what your virtual world is about. It's virtually nothing in comparison to the greater Metaverse and all of the hard work and years of attention many have given to something you feel I am too incompetent to address. You have done nothing more than carve a niche within an existing niche.

      And I would have said nothing about it and let you on your way had you enough sense not to imply my ignorance on the subject of the Metaverse and virtual worlds.

      You threw the first punch, and now I'm intellectually knocking you clear on your ass in public view for all to see.

      Go play with your game engine. Best of luck - you're going to need it.

    9. The simple fact of the matter, Leon, is that you just had your ass handed to you. Go back to your blog and cry about it. Pretend you had an argument at all, or made any point other than to set yourself up for humiliation on a grand scale.

      You simply aren't *man* enough to own up. And that's why this will be the final word in this discussion.
