

troff -> html, troffdown -> markdown, the postscript toolchain -> web layout engine, 9p -> http

9p could hyperlink all our man pages. A single mouse click would let you browse documentation (and research papers written in troff) from anywhere in the world.
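A minimal sketch of the hyperlinking idea: rewrite man-page cross-references like "ls(1)" into links pointing into a 9p-served manual tree. The /sys/man/section/page layout follows Plan 9's convention; the base path and output format are assumptions.

```python
import re

# matches cross-references of the form name(section), e.g. "ls(1)"
MANREF = re.compile(r'\b([a-z0-9_.-]+)\((\d)\)')

def linkify(text: str, base: str = "/sys/man") -> str:
    # rewrite "ls(1)" into a link into a hypothetical 9p-served man tree
    return MANREF.sub(
        lambda m: f'<a href="{base}/{m.group(2)}/{m.group(1)}">{m.group(0)}</a>',
        text)
```

Running this over troff output (or troff->html output) would make every SEE ALSO entry clickable.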

Automatic archival of all 9p content via cwfs. Make dumps like the Internet Archive, but applied to every file, giving you a wayback-machine snapshot of everything on 9p. Is this the web we have all been waiting for?

If the publisher is responsible for making the dump, what happens when the publisher dies? His 9p file hierarchy *and* his backups go offline simultaneously, leaving zero redundancy. Cwfs gives you a nice /n/dump/2023/0829/ snapshot, just like the Internet Archive does, but ideally a daily snapshot should be distributed among the reader base to avoid a single point of failure. I suspect cwfs could easily make its dumps to a remote file server. We would then be able to pull up really old archives of books, journal articles, research papers, and man pages -- DRM-free.
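A sketch of the naming a reader-side mirroring script could use, matching cwfs's date-based dump convention (/n/dump/YYYY/MMDD/) so pulled snapshots line up with the originals:

```python
from datetime import date

def dump_path(d: date, root: str = "/n/dump") -> str:
    # cwfs exposes daily dumps under root/YYYY/MMDD/
    return f"{root}/{d.year:04d}/{d.month:02d}{d.day:02d}/"
```

A mirror run on 2023-08-29 would store its copy under dump_path(date(2023, 8, 29)), i.e. /n/dump/2023/0829/.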

A new filesystem that takes daily snapshots like cwfs but stores them over torrents. Poor man's FS -- PMFS.
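The torrent half of PMFS could work the way BitTorrent already does: split the day's snapshot into fixed-size pieces and hash each one, so readers can verify and swarm-download pieces from each other. A minimal sketch of the piece hashing (the piece size is a typical choice, not a requirement):

```python
import hashlib

PIECE_LEN = 262144  # 256 KiB, a common BitTorrent piece size

def piece_hashes(snapshot: bytes, piece_len: int = PIECE_LEN) -> list[bytes]:
    # split the snapshot into fixed-size pieces and SHA-1 each,
    # as the "pieces" field of a .torrent file would
    return [hashlib.sha1(snapshot[i:i + piece_len]).digest()
            for i in range(0, len(snapshot), piece_len)]
```

Each reader holding even a few verified pieces adds redundancy, which is exactly what the single-publisher dump lacks.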

CGI for files and directories (FNS would handle dynamic generation of file paths). Perhaps see execfs.
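The idea in miniature: reading a path invokes a handler that generates the content on the fly, instead of returning stored bytes. The paths and handler table here are hypothetical, not part of execfs:

```python
import time

# hypothetical table of dynamic files: each read() runs the handler, CGI-style
handlers = {
    "/n/dyn/date": lambda: time.strftime("%Y-%m-%d") + "\n",
    "/n/dyn/motd": lambda: "generated on every read\n",
}

def read(path: str) -> str:
    # a synthetic file server would do this inside its Tread handler
    return handlers[path]()
```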

Imagine a single file with three different views: /path/to/file.{c,lua,lisp}. These three source files could be generated dynamically by some compiler from a single source file. Or perhaps three texts, /path/to/blog/{chinese,spanish,french}, all generated from /path/to/blog/english.
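A toy sketch of the multiple-views idea: the requested extension selects which rendering of a single underlying definition gets generated. A real version would run a compiler or translator; the templates here are stand-ins:

```python
def view(path: str, name: str) -> str:
    # generate the requested "view" of one underlying definition;
    # the extension on the path picks the target language
    ext = path.rsplit(".", 1)[-1]
    templates = {
        "c":    f"/* generated */\nvoid {name}(void);\n",
        "lua":  f"-- generated\nfunction {name}() end\n",
        "lisp": f";; generated\n(defun {name} ())\n",
    }
    return templates[ext]
```

Reading /path/to/file.lua would then yield the Lua rendering, with nothing stored on disk per language.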

Binaries can also be generated dynamically over the network so that users do not need to compile the software.

Because Plan 9 can boot from a remote filesystem, developers can easily work from dumb terminals.

The remote filesystem may enable a universal hardware lab, since it should be possible to easily cross-compile software for every architecture on the distributed grid.
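Plan 9 already makes the cross-compiling half easy: setting $objtype selects the target architecture's toolchain (5c for arm, 6c for amd64, 8c for 386), so one mkfile builds for any target. A sketch of how a grid scheduler might fan one source tree out to every architecture (the arch list and mk target are illustrative):

```python
# $objtype selects the Plan 9 cross-compiler: 5c (arm), 6c (amd64), 8c (386)
ARCHES = ["arm", "amd64", "386"]

def build_cmds(target: str = "install") -> list[str]:
    # the rc command each grid node would run for its architecture
    return [f"objtype={a} mk {target}" for a in ARCHES]
```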

Both p9p and 9fork (ircnow's custom fork) could be written mostly in Lua to speed up development.