The author states that the code was written by Opus, and afaik AI-written code is not considered copyrightable. Without copyright on the code, there should be nothing prohibiting you from running it or even redistributing it. Of course, this may come down to the extent of human contribution.
fwiw, the opposite view was recently taken for Node.js. GH hides most of the conversation by default due to size, but the gist is that a new VFS feature is being proposed for Node that was largely written using an LLM. The view I've more often seen these days is that if you prompted, steered, and (likely) modified code generated by an AI, then you generated the code and hold the copyright.
...you can't redistribute code without a license, but surely you can legally run it, can't you?
Like, if I write a blog post and put it on my blog, you're allowed to read it, right?
Heck, if my blog contains some Javascript code I wrote, I would imagine your web browser is allowed to run that code without opening you up to copyright infringement, even if I didn't provide an explicit license.
For what it's worth, there is an official, daily updated public dataset of all posts and comments. Therefore, the data clearly isn't something they consider a trade secret.
> Programs are downloaded to my computer and executed without me being able to review them first—or rely on audits by people I trust.
JavaScript and WebAssembly programs are always executed in a sandboxed VM, without read access to the host OS files (unless, of course, you grant it).
Enabling scripting was a necessary step for interactive websites. Without it, a full page load would be required every time you upvote a Hacker News comment. In my opinion, the real problem is that browsers allow too many connections to third-party domains, which are mostly ads and trackers. Those should require user-approved permissions instead of being the default.
The Triptych Proposals [1] cover a lot of common use cases for submitting information to a server and updating part of a page. Something like that should have been possible to implement early in web history (I perceive some similarity to frames).
Modern CSS (and some newer HTML features) also reduces the need for scripting.
I very much doubt that "Enabling scripting was a necessary step for interactive websites" (emphasis added). It may well have been the fastest and most convenient way to get that functionality to the most users: with JavaScript, each website could provide functionality without waiting for every browser to implement it.
However, distributing that power also leads to more complex trust relationships (even if one is confident that sandboxing is effective). Independent implementations also add more complexity overall.
In the world we have now, limiting XMLHttpRequest and Fetch to the same host as the current page would be great. But if that had always been the limitation, I fear that the adware peddlers would have just gotten proficient at shipping PHP packages/extensions that you could run on the same server as your site, and we'd be in largely the same situation, except that blocking the stuff would be harder than it is for us today.
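The same-origin restriction the comment above wishes were the default is at least expressible today on an opt-in, per-site basis via Content-Security-Policy: the `connect-src` directive confines fetch, XMLHttpRequest, WebSocket, and EventSource connections to the listed sources. A minimal response header for the "same host only" policy would look like:

```http
Content-Security-Policy: connect-src 'self'
```

Any script-initiated request to a third-party origin is then blocked by the browser, though of course this only helps when the site operator (not the user) chooses to ship the header.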
> Access to Internet is possible inside the emulator. It uses the websocket VPN offered by Benjamin Burns (see his blog). The bandwidth is capped to 40 kB/s and at most two connections are allowed per public IP address. Please don't abuse the service.
It looks like container2wasm uses a forked version of Bochs to get the x86-64 kernel emulation to work. If one pulled that out separately and patched it a bit more to fill in the remaining feature support, it'd probably be the closest fit overall. Of course, one could say the same about patching anything with enough enthusiasm :).
I looked at Prisma, but I very much prefer the Protobuf/Thrift model of using numbers to identify fields, which enables two important things: renaming fields without breaking backward compatibility, and a compact wire format.
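To make the field-number point concrete, here is a toy sketch (not code from Prisma or Skir) of Protobuf's varint wire encoding: the serialized key packs only the field number and wire type, never the field name, which is why renames are free and the format stays compact.

```python
def encode_varint(n: int) -> bytes:
    """Encode a non-negative integer as a Protobuf base-128 varint."""
    out = bytearray()
    while True:
        b = n & 0x7F
        n >>= 7
        if n:
            out.append(b | 0x80)  # continuation bit set: more bytes follow
        else:
            out.append(b)
            return bytes(out)

def encode_field(field_number: int, value: int) -> bytes:
    """Encode a varint-typed field. The key byte carries the field
    number (and wire type 0), not the field's name."""
    key = (field_number << 3) | 0
    return encode_varint(key) + encode_varint(value)

# The classic example: field 1 set to 150 is just 3 bytes on the wire.
assert encode_field(1, 150) == b"\x08\x96\x01"
```

Since only the number `1` appears in the output, the schema can rename that field to anything without touching existing serialized data.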
I think the Protobuf language (which Skir is heavily influenced by) has some flaws in its core design: e.g. the enum/oneof mess; the fact that it allows sparse field numbers, which makes the "dense JSON" format (a core feature of Skir) harder to achieve; and the fact that it does not let users optionally attach a stable identifier to a message so that compatibility checks can work.
I get your point about "why build another language", but that point taken too far means we would all be programming in Haskell.