VR Hackfest 2.0

Do you remember back in the summer of 2019 when we held the first VR Hackfest at Mitchell Park? In less than 3 hours we showed attendees not only how to create simple, immersive 3D scenes using a technology called A-Frame, but also how to publish them online for free using an internet protocol called IPFS. This hands-on coding workshop was one of the most popular tech events hosted by the library that year! We even published a paper about it for other librarians to read, and took the entire workshop idea to CENIC (the Corporation for Education Network Initiatives in California).

Enough about us, though. Where is this technology now, four years later? Read on for a quick recap of how the two core technologies we presented at the VR Hackfest, A-Frame and IPFS, have evolved. The best part? Both are still completely free to use, and very easy to get started with just basic HTML skills. If you can code a simple website, you can start building VR today!
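To give you a taste of how little it takes, here is a minimal "hello world" scene in the spirit of the official A-Frame starter example. It is an illustrative sketch: the version number in the script URL is one of the 2023-era releases, so check aframe.io for the current one.

```html
<!DOCTYPE html>
<html>
  <head>
    <!-- Load the A-Frame library (check aframe.io for the latest release) -->
    <script src="https://aframe.io/releases/1.4.0/aframe.min.js"></script>
  </head>
  <body>
    <!-- Everything inside <a-scene> becomes a 3D, VR-ready world -->
    <a-scene>
      <a-box position="-1 0.5 -3" rotation="0 45 0" color="#4CC3D9"></a-box>
      <a-sphere position="0 1.25 -5" radius="1.25" color="#EF2D5E"></a-sphere>
      <a-plane position="0 0 -4" rotation="-90 0 0" width="4" height="4" color="#7BC8A4"></a-plane>
      <a-sky color="#ECECEC"></a-sky>
    </a-scene>
  </body>
</html>
```

Save that as an .html file, open it in a browser, and you are standing in your first VR scene. Each tag works just like a regular HTML element, with attributes for position, rotation, and color.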


A-Frame is a web framework for creating virtual reality experiences, and it has gained many new features that have made it even easier to use for both experienced developers and beginners. Like the name implies, A-Frame is essentially a pre-fabricated kit for building a specific kind of house, not unlike the famous Eichler Houses around Mitchell Park itself. 

This "tech kit" is useful because it comes with all the necessary materials, instructions, and tools for building that specific type of website. You don't have to start from scratch, and the list of new tools in this toolkit has continued to grow since the early days of 2019.

Here are some new abilities to try out in your VR scenes:

  • You can now add physics engines, which enable realistic interactions between objects. This means that objects can have gravity, collide with each other, and be affected by forces. The developers have also added procedural geometry, letting you create complex shapes and structures with just a few lines of code.
  • Hand-tracking is another new feature which allows users to interact with objects in the virtual environment without a gamepad or separate controller. Support has been added for the latest VR headsets, including the Quest Pro (the latest one from Meta/Facebook). This includes features like WebXR foveation, which more and more VR headsets are starting to use. 
  • Last but not least, the A-Frame code inspector has been improved and bugs have been fixed, making it more reliable and easier to work with. With the code inspector you can see how other people built their A-Frame creations and modify them just like HTML. It's like being able to see the underlying code in The Matrix. Whoa!
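The physics abilities mentioned above come from a community add-on rather than core A-Frame. The sketch below assumes the open-source aframe-physics-system component; the exact script URL and version are placeholders, so check the project's README for current ones.

```html
<!-- A-Frame core, then the community physics component (URLs are illustrative) -->
<script src="https://aframe.io/releases/1.4.0/aframe.min.js"></script>
<script src="https://cdn.jsdelivr.net/gh/c-frame/aframe-physics-system/dist/aframe-physics-system.min.js"></script>

<!-- Turn on physics for the whole scene, with Earth-like gravity -->
<a-scene physics="gravity: -9.8">
  <!-- dynamic-body: this box falls, bounces, and collides -->
  <a-box dynamic-body position="0 4 -4" color="#4CC3D9"></a-box>
  <!-- static-body: the floor participates in collisions but never moves -->
  <a-plane static-body rotation="-90 0 0" width="10" height="10" color="#7BC8A4"></a-plane>
</a-scene>
```

Open the page and the box drops onto the plane under gravity, with no JavaScript written by you at all.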

Interested in learning more about A-Frame?

Visit the A-Frame School website for interactive tutorials that teach the basics. The site takes several hours to work through if you follow each code example hosted on Glitch. Soon you'll be a VR star 🤩


In the second part of our workshop, we collected the 3D scenes people created and published them to the Internet using the cleverly named InterPlanetary File System (IPFS), a distributed web technology developed in Palo Alto by Protocol Labs. In 2023, IPFS is still a cutting-edge technology that allows anyone to publish to the Internet without a server, through a peer-to-peer network that works seamlessly with the regular Internet. How can something be cutting edge while still being 4+ years old? Well, nowadays people talk about web3 like it's a new thing, but the starting point for web3 was "decentralized web" projects like IPFS.
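Publishing a site the way we did at the workshop still boils down to a couple of commands with the IPFS CLI. This is a sketch assuming you have the IPFS command-line tool installed and a folder of files to share; the CID shown is a placeholder, since every file gets its own unique content address.

```
# One-time setup of your local IPFS repository
ipfs init

# Add your site folder; IPFS prints a content identifier (CID) per file,
# ending with the root CID for the whole folder
ipfs add -r my-vr-scene/
#   added <CID> my-vr-scene/index.html
#   added <root-CID> my-vr-scene

# Anyone, anywhere can now fetch it through a public gateway:
#   https://ipfs.io/ipfs/<root-CID>/index.html
```

Notice there is no server to rent and no account to create: the folder's address is derived from its contents, and any node in the network can help serve it.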

Like the A-Frame developers, Protocol Labs and their community of open-source contributors have been very busy building out this tool set. They are close to launching something interesting called the Filecoin Virtual Machine, and over the past several months they have been working with Lockheed Martin to deploy IPFS in space.

Explaining how IPFS works takes at least 30 minutes for the uninitiated, and the project has seen more changes in the last 4 years than I can fit into a single blog post. You might recall hearing about IPFS when NFTs were gaining steam in 2022. Molly Mackinlay, Head of Engineering, Product, & Research Development at Protocol Labs, gives a nice summary of the project's progress in the video below.

Ready to go offworld with IPFS?

Join ProtoSchool! This is the official training source for IPFS, sponsored by Protocol Labs. Bring your scuba gear, it's a deep dive! 🤿 Protocol Labs also has a very active YouTube channel with tech talks, and publishes their twice-weekly IPFS developer meetings there.

Future Work

Now for the fun part. Just imagine what will be possible in another year with recent developments from OpenAI! Imagine being able to create your own virtual world, complete with interactive objects and realistic physics, using a few lines of A-Frame code with your ChatGPT co-pilot by your side.

Imagine being able to share that virtual world with anyone on the planet, without relying on a centralized server or marketplace. Everything described in this blog post can be done DIY-style, easily. You do not have to ask permission or sign any paperwork. That's the power of A-Frame and IPFS, and one of the reasons why we keep talking about them at Palo Alto City Library.

If you have any questions or want to learn more about how to get started with A-Frame and IPFS, let me know what you're thinking about doing. At the very least, I can share some links to online courses in our Upskill databases that will help you get started, and for pros out there, we have a brand new space called the ReBoot Room that could host your workshop!