Spent a lot of time over the weekend getting WebAssembly jammed into my brain and tooling up.
I looked hard at mono-wasm (now built into the mainline Mono distribution). It was generating huge wasm files, pulling in lots of code I wasn't explicitly adding, so I have to look much harder at this. It seems like they're making an attempt to get the .NET foundation classes implemented (though a lot of the code just throws a "Not Implemented" error). Still looks promising; I'd rather the first-gen emissaries be C#. It's not really focused on WASI and embedding yet -- I think the emphasis is mostly on browser use and the DOM.
Discovered that there's a PyPI module with wasmtime in it, so I can embed that and make a test framework. I'm trying to get my hello-world modules working in Rust and establishing the toolchain for all of that.
They split the wasmtime .NET framework out into a new package. It looks like they completely redid the API, and it seems much easier to use. It doesn't have to discover the host WASI modules via reflection, so I can have a call which sets up which APIs are available to each class of emissary (a method call can define each method).
I need to learn WebAssembly's assembly language. It's been a good 40+ years since I did assembler regularly, but this one might end up being the ultimate superset assembler. It would be interesting to build a real assembler for it (rather than the Lisp-like, fully regularized syntax it has now).
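For reference, that fully regularized s-expression syntax looks like this (a toy function, not anything from the project):

```wat
;; A toy add function in the WebAssembly text format (WAT).
(module
  (func $add (param $a i32) (param $b i32) (result i32)
    (i32.add (local.get $a) (local.get $b)))
  (export "add" (func $add)))
```

Every construct is a parenthesized form, which is exactly what makes it easy to generate mechanically and tedious to write by hand.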
VR Worlds is a concept for building, essentially, a VR/AR (XR) operating system, which allows any number of worlds to operate together and run on modern, scalable, containerized servers. Anyone can field their own worlds and make their own rules, rather than being bound to some giant top-down architecture.
Monday, April 13, 2020
Sunday, April 5, 2020
2020 Has Eaten My Brains...
Wow. I've had a busy several months. I think about this project quite a lot, but have had little time. The SF conventions that I normally spend coding on this stuff have been cancelled. Alas, Norwescon...
Getting the First Person framework working on the loading deck. I've got the VR integrated, but I still need to get all the movement stuff working. Unity is having problems with the SteamVR plugin in some of its more recent versions. I'm trying to run on 2019.3, but still have to remove several of the integration methods to get it to compile. I'm going to try to get teleport movement working. Supposedly Valve is going to write a new version of SteamVR (or OpenVR -- not exactly sure). This might limit me to the 2019 version of Unity for a while. Rereading all this and the updates, I think I need to start using the XR framework.
Keeping an eye on Wasmtime. They're doing good work there. There's a new NuGet preview, and looking at the sources, they're changing and simplifying the calling structure. It looks like there's going to be a much easier way to integrate the private APIs I need to expose via private WASI: you can just call a method to define a function. So I can start putting together some classes to autogenerate the APIs. It also lets me define different APIs and only load the ones that are appropriate (entities, avatars, and worlds likely vary in which APIs are present). Eventually I'll start to implement the standard WASI interfaces that are appropriate (like file IO, but only to the sandboxed, read-only emissary filesystem). It's still looking like Rust will be the first language for doing Emissaries, though I need to see how other systems are working. Blazor is going nicely in the .NET/C# world, but it works rather differently -- there's a CLR interpreter running in WebAssembly. I might look into whether this is sufficiently well documented to write my own CLR classes for it, or whether I'll need my own WebAssembly module (it may be assuming a DOM and other non-embedded things -- though eventually it might be worthwhile having a working DOM).
In the just-fun category, I had an interesting idea. Someone has an actual firmware VT100 emulator running on a virtual JavaScript 8080 CPU. This is not what is conventionally known as a "VT100 emulator" -- it is truly running the ROM code from a VT100. It would be possible to build one of these into an emissary (I see some 8080 emulators in Rust). I was originally thinking that these emulators would run server-side and expose themselves in the VR environment via a pseudo-serial connection. So it would be truly interesting to do this with the other SIMH emulators, and with MAME and libretro. These are mostly in C, so it might be possible to get them into WebAssembly. MAME has some other VT emulators built in, as well as the million other things MAME can do. I hear rumors that there's an SDL layer that's been ported to WebAssembly (probably targeting a DOM canvas, but I can probably map that onto a texture).
The next big effort is to get the various multi-process servers to bidirectionally ping each other and shut down when the browser exits. There's a ping that occurs when the service comes up to indicate that it's ready. I also had the thought to have some kind of indicator of presence and health on the loading deck, like a luminous sphere for each server; color could indicate load or CPU %. It was a bit of an epiphany that I could start with simple emissaries that don't really require an external entity server, and show them on the loading deck. Getting all of the pieces together for loading external content is still a ways off. The initial avatar won't really need any emissaries until we want to start showing a second-person view. Though we might eventually show hands and such in first-person view -- that might be initially hardcoded and rely on what is available in SteamVR.
I had a sketch of the necessary emissary APIs, so I will also start fleshing those out. Those need some practical hammering so we can come up with a nice, coherent architecture. Mostly they allow the emissary to manipulate the 3D object which it controls (hopefully via the new Entity framework in Unity). There has to be a bidirectional message pump between the browser process and the emissaries: the emissary requests a change and the data gets passed to the browser and implemented (or it requests a callback to return info back to the emissary). I need to see how gRPC has progressed in the last few months. Perhaps gRPC over named pipes on Windows has been implemented in .NET, so we can have very fast communication for these message loops.
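The message-pump shape is independent of the transport. Here is a toy sketch of the request/callback loop with two in-process queues standing in for the real channel (which would be gRPC, ideally over a named pipe); the message ops and the position state are invented for illustration:

```python
# Sketch of the emissary <-> browser message pump. Two queues stand in for
# the real transport (gRPC, ideally over a named pipe). Ops are illustrative.
import threading
import queue

to_browser = queue.Queue()
to_emissary = queue.Queue()

def browser_loop():
    # Browser side: apply requested changes to the scene, answer queries.
    state = {"pos": (0.0, 0.0, 0.0)}
    while True:
        msg = to_browser.get()
        if msg["op"] == "set_position":
            state["pos"] = msg["pos"]
        elif msg["op"] == "get_position":
            to_emissary.put({"pos": state["pos"]})
        elif msg["op"] == "quit":
            break

t = threading.Thread(target=browser_loop)
t.start()

# Emissary side: request a change, then a callback returning the new state.
to_browser.put({"op": "set_position", "pos": (1.0, 2.0, 3.0)})
to_browser.put({"op": "get_position"})
reply = to_emissary.get()
to_browser.put({"op": "quit"})
t.join()
print(reply)  # {'pos': (1.0, 2.0, 3.0)}
```

The real version would cross a process boundary, so the serialized messages (protobufs over gRPC) replace the dictionaries, but the request/callback discipline stays the same.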
I'm also seeing lots of nice API documentation on GitHub. I need to work on that for the new code.
Found a named-pipe implementation for gRPC.