path: root/src/tokenizer.cpp
author    Andrew Kelley <andrew@ziglang.org>  2019-10-04 17:39:35 -0400
committer Andrew Kelley <andrew@ziglang.org>  2019-10-04 20:18:06 -0400
commit    dca6e74fec247f2d15de3cfad9e0e23bc7884212 (patch)
tree      dc409bcd1ce27e92715dc859daf767f093457e72 /src/tokenizer.cpp
parent    2f4dad04e06005318ae3f37798e734a63edef6c0 (diff)
download  zig-dca6e74fec247f2d15de3cfad9e0e23bc7884212.tar.gz
          zig-dca6e74fec247f2d15de3cfad9e0e23bc7884212.zip
proof of concept of stage1 doc generation
This commit adds a `-fgenerate-docs` CLI option, which outputs:

* doc/index.html
* doc/data.js
* doc/main.js

In this strategy, we have one static HTML page and one static JavaScript file, which loads the semantic analysis dump directly and renders it using DOM manipulation. Currently, all it does is list the declarations, but there is a lot more data available to work with. The next step would be making the declarations hyperlinks and handling page navigation.

Another strategy would be to generate a static site with no JavaScript, based on the semantic analysis dump that Zig now provides. I invite the Zig community to take on such a project. However, this version, which relies heavily on JavaScript, will also be a direction explored.

I also welcome contributors to improve the HTML, CSS, and JavaScript of what this commit started, as well as whatever improvements are necessary to the semantic analysis dumping code to provide more information.

See #21.
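As a rough illustration of the "static page + JavaScript" strategy described above, the sketch below renders a declaration list from an analysis dump. The `zigAnalysis` object shape and the `renderDeclList` helper are assumptions for illustration only, not the actual format of doc/data.js or the code in doc/main.js; the real page would insert nodes via DOM manipulation rather than build a string.

```javascript
// Hypothetical analysis dump, standing in for what doc/data.js might
// provide. The real dump format is defined by the stage1 compiler and
// is richer than this.
const zigAnalysis = {
  decls: [
    { name: "mem" },
    { name: "ArrayList" },
  ],
};

// Build the declaration list as an HTML string so this sketch also
// runs outside a browser; in doc/main.js this step would use the DOM
// API (document.createElement, appendChild, etc.) instead.
function renderDeclList(analysis) {
  const items = analysis.decls
    .map((decl) => `<li>${decl.name}</li>`)
    .join("");
  return `<ul>${items}</ul>`;
}

console.log(renderDeclList(zigAnalysis));
```

Turning each `<li>` into a hyperlink and wiring up page navigation, as the commit message suggests, would be the natural next step on top of a renderer like this.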
Diffstat (limited to 'src/tokenizer.cpp')
0 files changed, 0 insertions, 0 deletions