
Self-distributing a web-extension with IPFS

Self-distributing a web-extension and getting traffic isn't easy, especially when you decide to use technologies that aren't widely known. I did it anyway because it's interesting.

About a year ago, I forked a web-extension for Firefox called Containerise by the honorable kintesh because I wanted to implement quite a few new things. Additionally, there were big changes to be made to the code base, the major one being getting rid of webpack.

Primarily, I'm not a fan of asset compilers, pre-compilers and the like. It's understandable to use those tools in big projects where a single deliverable is desired to reduce page loading times, things have to be minified, different formats have to be compiled into one a browser can understand, and so on. My biggest gripe with webpack, however, is the complexity of the tool. Anyway, that's not what this is about...

The result thereof is Bifulushi, a webpack-free web-extension that runs without an intermediate compile phase.

Publishing on addons.mozilla.org (AMO)

While getting rid of webpack, VueJS was introduced at the same time to make developing UIs easier; WebComponents in vanilla JS are quite an unpleasant experience.

VueJS and other frameworks like it (React, Angular, ...) use eval() to compile expressions in their components. One has to allow that with "content_security_policy": "script-src 'unsafe-eval' 'self'; object-src 'self'", which gives scripts running in the extension the ability to call eval(). Without it, any call to eval() simply fails with an error in the browser console.
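For illustration, a minimal manifest.json fragment with that policy could look like this (all keys besides content_security_policy are placeholders, not Bifulushi's actual manifest):

```json
{
  "manifest_version": 2,
  "name": "my-extension",
  "version": "1.0.0",
  "content_security_policy": "script-src 'unsafe-eval' 'self'; object-src 'self'"
}
```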

Since the offending script (VueJS) is run by the extension in a secure context in Firefox, where third-party scripts from the internet aren't loaded, that should be OK. Well... that doesn't seem to be the case. AMO has a strict "no unsafe-eval" policy, and their reviewers will shut down any extension that violates it.

Unless... somehow you have some clout and can force their hand. My reviewer said some extensions have been given exceptions, but which ones, why and how wouldn't be revealed.

Sound familiar?

Walled gardens

Yep, Apple recently got in trouble with Epic because they are prosecutor, judge and jury of which apps can be installed on their phones. They will claim it's for the security of their platform, which is a point I can understand, but as with many things Apple, that's just an excuse to have a tight grip on their customers. It's quite difficult for competition to emerge against Apple on its own platform, because Apple will just ban, buy or integrate the app.

Mozilla makes the same security claim, but in their defense, it's possible to install add-ons that weren't signed by Mozilla: all you have to do is allow the installation of unsigned add-ons. It's not possible, however, to add another "store" to Firefox's list of trusted stores.

That's... better than Apple, but not that great either.

The final option: signed by Mozilla but self-published

If you don't want to learn how to sign your own addons - because the documentation is waaaay outdated and it's frankly useless since Mozilla won't let you use it anyway - you can let Mozilla sign your addon.

Mozilla can reject your extension from AMO but will still sign it and allow you to distribute it.

Personally, I honestly don't get it. If your extension is doing something so bad that they won't publish it themselves, why sign it and let Firefox install it? I can only guess it's to keep the keys firmly in their hands and make it a little harder for users to install your extension. They must be trying to balance security, freedom and control.

In any case, this is the path taken for Bifulushi.

Distribution with IPFS

That was a long foreword, but here we are. So what had to be done?

Following Mozilla's documentation on the Firefox Extension Workshop:

  • Build your extension
  • Pass AMO the extension to get it signed
  • Publish the extension somewhere where people can download and install it

For allowing updates, additional steps have to be taken:

  • Add an update_url to your manifest.json. It should point to an update.json containing a list of URLs to the different versions of your extension
  • Update that JSON when publishing a new version
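The update.json Firefox expects maps the extension's ID to a list of versions with download links. A minimal example (the ID and URLs here are placeholders):

```json
{
  "addons": {
    "extension-id@example.com": {
      "updates": [
        { "version": "1.0.0", "update_link": "https://example.com/files/1.0.0.xpi" },
        { "version": "1.1.0", "update_link": "https://example.com/files/1.1.0.xpi" }
      ]
    }
  }
}
```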

How is this done?

I didn't want to do this manually every time a new version was released, and I was hoping that the extension would be available beyond the lifetime of my server. The former can be done with CI (in this case Gitlab's CI) and the latter with IPFS.


  • Git-tag the commit
  • Push to Gitlab
  • CI builds, signs and pushes the signed .xpi (the extension) to a server
  • On the server:
    • Add the new .xpi to IPFS
    • Regenerate update.json
    • Add the folder containing update.json to IPFS
  • (Manually) Update the DNSlink entry on the domain
  • (Manually) Update the README with a link to the new version

You can look at .gitlab-ci.yml to see the CI part. It's fairly straightforward.
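As a rough sketch (not the project's actual file; job names, variables and the server address are made up), such a pipeline stage could look like this, using web-ext for signing:

```yaml
# Hypothetical signing/publishing stage, triggered only on git tags.
sign-and-publish:
  image: node:lts
  rules:
    - if: $CI_COMMIT_TAG
  script:
    # Sign via AMO; credentials come from CI variables.
    - npx web-ext sign --api-key "$AMO_JWT_ISSUER" --api-secret "$AMO_JWT_SECRET" --channel unlisted
    # Push the signed .xpi to the server over ssh (stdin upload, see below).
    - ssh -i "$DEPLOY_KEY" publisher@example.com "$CI_COMMIT_TAG" < web-ext-artifacts/*.xpi
```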

Pushing to a server

This needs a little more explanation and is the major part of the work done.

ssh makes it possible to force connections authenticated with a certain public key to execute a specific command. There are lots of other documented options. An example of the command option in authorized_keys:

command="mycommand" ssh-rsa ABCDE...

And to execute it: ssh -i yourkey user@yourserver. With that key, only the specified command will be executed. Commands executed this way receive their arguments in the SSH_ORIGINAL_COMMAND environment variable.

They also accept input from stdin! This allows things like ssh -i yourkey user@yourserver < somefile.

...which is exactly what was used in this case. The major reason was to limit the size of the data being sent.
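A minimal sketch of such a forced-command handler, assuming a 10 MiB cap and made-up file names (the real pexfs is more involved):

```shell
# save_upload VERSION: read an upload from stdin, capped at a maximum size.
save_upload() {
  max_bytes=$((10 * 1024 * 1024))   # refuse to read more than 10 MiB
  version="${1:-unversioned}"
  out="/tmp/upload-$version.xpi"
  # head -c stops reading after max_bytes, so a runaway upload can't fill the disk.
  head -c "$max_bytes" > "$out"
  echo "$out"
}

# Simulate what sshd does for a forced command: the client's command line
# lands in SSH_ORIGINAL_COMMAND, and the file content arrives on stdin.
SSH_ORIGINAL_COMMAND="1.2.3"
printf 'fake-xpi-bytes' | save_upload "$SSH_ORIGINAL_COMMAND"
# prints /tmp/upload-1.2.3.xpi
```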

The command on the server

A small tool with an admittedly bad name was written: welcome pexfs, which stands for Publish Web-Extension to IPFS.

The principle is simple:

  • Accept data from stdin up to a maximum size
  • Store it in $parentDir/$addonID/files/$version.xpi
  • Regenerate $parentDir/$addonID/update.json from the .xpi files in the files/ directory
  • Add $parentDir/$addonID to IPFS and print out the resulting hash
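The update.json regeneration step above can be sketched in a few lines of shell. The addon ID, directory and base URL below are made-up placeholders, and the final IPFS step is skipped when no daemon is installed:

```shell
# Demo setup: two empty stand-in .xpi files.
ADDON_ID="bifulushi@example.com"
DIR="/tmp/pexfs-demo/$ADDON_ID"
mkdir -p "$DIR/files"
touch "$DIR/files/1.0.0.xpi" "$DIR/files/1.1.0.xpi"

# Rebuild update.json from whatever .xpi files exist on disk,
# deriving each version from the file name.
{
  printf '{ "addons": { "%s": { "updates": [' "$ADDON_ID"
  sep=""
  for f in "$DIR"/files/*.xpi; do
    v=$(basename "$f" .xpi)
    printf '%s { "version": "%s", "update_link": "https://example.com/files/%s.xpi" }' "$sep" "$v" "$v"
    sep=","
  done
  printf ' ] } } }\n'
} > "$DIR/update.json"

# Finally, the whole folder would be added to IPFS to get its hash.
if command -v ipfs >/dev/null 2>&1; then
  ipfs add -r -Q "$DIR"
fi
```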

Update DNSlink entry on domain

IPFS has a great feature that allows adding IPFS information to a domain / DNS entry. It's called DNSlink and it's pretty simple: Add a TXT entry to your domain with dnslink=/ipfs/... or dnslink=/ipns/....

Now, browsers or extensions that know what to do with it can use it. I read somewhere that a browser exists that uses it to redirect users to the IPFS path, and some extensions can do the same: they resolve the hash, which in our case points to the uploaded folder.

That DNS entry unfortunately has to be maintained manually for now, but that will change.

Update README with link to new version

The last link to update is the one in Bifulushi's README that points to the new version.

What's left to do?

The work is never done, and even though the result is a self-updating, self-published extension, the user experience is still quite lacking.

Mirrored IPFS pins

Pinning on IPFS is the action of dedicating the node to keeping a certain item for the long term. The details aren't too clear to me, but I assume there's a Least Recently Used (LRU) cache of items a node has downloaded. Once the cache fills, items can be replaced in it. Pinning moves items out of the cache into a more permanent location; an action the node operator can of course undo.

My IPFS node is currently the only one pinning the extension. More nodes would have to pin the same data as the original server called by the CI.

Allow a custom HTML page beside the update.json

Currently, the link to the latest extension version is updated manually in the README. It would be much better if the link simply pointed to an HTML page that was updated automatically by the CI.

Of course, the page would need to be generated somehow and all its assets pushed to IPFS.


It's not easy to find the extension at the moment. Once the page is generated, it would have to be published somewhere.

Update DNSlink entry with CI

The DNS entry doesn't have to be updated manually. .gq domains are registered on Freenom, which doesn't have its own API, but there are tools out there that work around this by emulating a browser. Luckily, Freenom doesn't use a fancy UI full of JS, so some clever HTML parsing and simple HTTP requests can achieve the desired result.

Maybe the next time I write something, these tasks will have been accomplished 🙃