The name is OpenLara (https://github.com/XProger/OpenLara) and you can try out the WebGL build directly in your web browser at http://xproger.info/projects/OpenLara/. The web version works amazingly well on my Pixel 7a with touch controls (you have to click the “go fullscreen” button), using Firefox as the browser.
Compared to what?
Compared to native platforms.
Okay, I have to admit that that’s leaving me a bit nonplussed. Assume for a moment that I am concerned about the security implications of running an open-source Tomb Raider engine implementation. How exactly are you proposing running this in a more-secure fashion?
If I run an executable on my platform – say, an ELF binary on Linux – then normally that binary is going to have access to do whatever I can do. That’s a superset of what code running inside a Web browser can do.
Are you advocating for some form of isolation? If so, what?
EDIT: And I’ve got another question for you. Let’s say you’re worried about the security of browser APIs. How do you avoid that? If your browser is vulnerable to some exploit in its WebGL implementation, then declining to click on links explicitly labeled as leading to websites that use 3D – which is what you appear to be urging people to do – won’t protect you. Any site you browse to, labeled as such or not, could well expose you to that vulnerability.
EDIT2: In another comment, you say that you want to trust the “kernel” instead of the browser. Okay, fine. There is a whole class of isolation mechanisms there. Which mechanism are you proposing? Remember that you need to give whatever software package is involved access to your 3D hardware, and the Linux kernel, at least, doesn’t have a mechanism for creating virtual, restricted “child” graphics devices. The closest you can get at the kernel level on Linux, as far as I can tell, would be pass-through from a VM to a dedicated graphics adapter, which probably isn’t going to be an option for most people, and I have doubts about it being a carefully-hardened pathway compared to browser APIs.
Kernel sandboxing. I mean, breaking out of browser “sandboxes” is practically a sport these days.
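To put something concrete behind that, here’s a rough sketch (mine, not from the OpenLara repo – the paths and binary name are placeholders) of what kernel-level sandboxing of a native build could look like, using bubblewrap, the namespacing tool Flatpak builds on, to expose only a read-only system and the GPU device nodes:

```python
#!/usr/bin/env python3
# Sketch: run a native game binary under bubblewrap (bwrap) so it sees
# a read-only system, its own directory, and the DRI GPU nodes, and
# nothing else. Assumes bwrap is installed; all paths are placeholders.
import os
import subprocess

game_dir = os.path.abspath(".")  # hypothetical directory holding the binary

cmd = [
    "bwrap",
    "--ro-bind", "/usr", "/usr",           # read-only system files
    "--ro-bind", "/lib", "/lib",
    "--ro-bind", "/lib64", "/lib64",
    "--ro-bind", game_dir, "/app",         # the game itself, read-only
    "--proc", "/proc",
    "--dev", "/dev",                       # minimal /dev ...
    "--dev-bind", "/dev/dri", "/dev/dri",  # ... plus the 3D device nodes
    "--unshare-all",                       # fresh PID/net/IPC/user namespaces
    "--die-with-parent",
    "--chdir", "/app",
    "/app/OpenLara",                       # hypothetical binary name
]
subprocess.run(cmd, check=True)
```

Granted, to actually get a window you’d also have to bind in the X11 or Wayland socket, which widens the attack surface again – that part of your GPU-pathway objection stands.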
Which is why using the web without JavaScript is a security measure which I strongly recommend. Sure, many sites will be “less interactive” then, but I’m afraid that it is the only solution. For the (usually rather small) number of websites which you absolutely need to use with JavaScript enabled (do you, really?), a separate browser inside a container (or VM) would be a good option. I admit that this is not the most comfortable setup, but I’d really rather be safe than sorry. YMMV, but you asked.
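In case it helps, the “separate browser” part doesn’t need to be fancy. A minimal sketch, assuming Firefox is installed: a throwaway profile with JavaScript switched off via the javascript.enabled pref, launched as its own instance (the container or VM would be layered around this):

```python
#!/usr/bin/env python3
# Sketch: launch Firefox with a throwaway profile that has JavaScript
# disabled via the javascript.enabled pref. Assumes Firefox is installed;
# container/VM isolation would be layered on top of this.
import os
import subprocess
import tempfile

profile = tempfile.mkdtemp(prefix="nojs-profile-")
with open(os.path.join(profile, "user.js"), "w") as f:
    f.write('user_pref("javascript.enabled", false);\n')

# -no-remote keeps this instance separate from any already-running Firefox
subprocess.run(["firefox", "-no-remote", "-profile", profile])
```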
That’s a class of different mechanisms. I updated my comment above. I’ll repeat the text there:
Virtually every website out there today uses JavaScript. Lemmy uses JavaScript. What makes this particular website a risk?
Yeah, I do. Fifteen years ago, I used NoScript, and some things broke, but it was usable; there were enough people running non-JS-capable browsers that websites had a reasonable chance of functioning. The Web generally does not function without JavaScript today.
Most of those work without it.
Lemmy is one of several ActivityPub-capable applications. You do not need to use Lemmy inside a web browser in order to participate here. In fact, you don’t even need to use a web browser.
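As an illustration: Lemmy exposes a plain HTTP API, so a few lines can read the front page with no browser (and no JavaScript) involved. A quick sketch against the v3 API – the instance URL is just an example:

```python
#!/usr/bin/env python3
# Sketch: read a Lemmy front page over its public HTTP API (v3),
# no browser needed. The instance URL is just an example.
import json
import urllib.request

url = "https://lemmy.world/api/v3/post/list?sort=Hot&limit=10"
with urllib.request.urlopen(url) as resp:
    data = json.load(resp)

for item in data["posts"]:   # each entry is a post_view object
    post = item["post"]
    print(post["name"], "->", post["ap_id"])
```

Native clients talk to that same API; the federation between servers happens over ActivityPub, which is also just HTTP.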
I disagree. Some websites (with lazy developers) work less well without JavaScript. You’ll gain fewer annoyances (no JS means no pop-ups and no sophisticated anti-adblock techniques), more speed, lower energy consumption, and fewer potential security risks. You’ll lose… not much, really: “web applications” (usually worse, slower and less reliable than installed software) and a couple of websites which are very focused on providing effects over content. Sounds like a fair deal to me, but again, YMMV.
Yes, there will never be absolute security. If it runs on a computer, it most likely has security flaws.