fun: fun personal project research. gpt token optimizer

master
Hung 2023-06-03 23:48:38 -07:00
parent 969834a1ed
commit c70a653099
2 changed files with 49 additions and 48 deletions

View File

@@ -1,8 +1,9 @@
;; basically transpiled from Fennel's website https://fennel-lang.org/see
;; Lua (git:603e6cc, ->fennel->lua to remove docs): (reduced) 4000 GPT-3 tokens, 3800 Codex tokens, 9800 chars
;; Fennel (git:603e6cc): 5106 GPT-3 tokens, 3488 Codex tokens, 10110 chars
;; Fennel (git:603e6cc, manually removed whitespace)
-;; Fennel got hit mostly by whitespace
+;; Fennel got hit mostly by whitespace in GPT-3
+;; Fennel (git:603e6cc, manually removed whitespace): 4048 GPT-3 tokens, 3443 Codex tokens, 9051 chars
(set buf-select.QBuf {})
(set buf-select.QBuf.__index buf-select.QBuf)
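The "manually removed whitespace" variant measured above comes down to dropping indentation and blank lines before tokenizing. A minimal Lua sketch of that step (`strip_whitespace` is a hypothetical helper, not code from this commit):

```lua
-- Collapse leading indentation and blank lines: the transformation
-- behind the 5106 -> 4048 GPT-3 token drop measured above.
local function strip_whitespace(source)
  local out = {}
  for line in source:gmatch("[^\n]+") do      -- "[^\n]+" skips blank lines
    out[#out + 1] = (line:gsub("^%s+", ""))   -- strip leading indentation
  end
  return table.concat(out, "\n")
end

print(strip_whitespace("(fn add [a b]\n  (+ a b))"))
-- (fn add [a b]
-- (+ a b))
```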

View File

@@ -45,7 +45,7 @@ TL;DR:
- Buffer-oriented parser
- `local buf_parser = vim.treesitter.get_parser(bufnr: number): LanguageTree`: Buffer-specific parser
- `buf_parser:trees()`: Returns the parsed syntax trees for the buffer's "host" language (usually determined by filetype)
-- `buf_parser:children(): table<language(string), LanguageTree>`:
+- `buf_parser:children(): table<language(string), LanguageTree>`: Treesitter injection! (see the sketch below)
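The injected-language trees behind that last bullet are reachable through `children()`. A rough sketch, assuming Neovim's built-in `vim.treesitter` Lua API (`get_parser` and friends):

```lua
-- Enumerate injected-language subtrees of the current buffer.
local parser = vim.treesitter.get_parser(0)   -- LanguageTree for buffer 0
parser:parse()                                -- make sure the trees are current
for lang, child in pairs(parser:children()) do
  -- each child is itself a LanguageTree for the injected language
  print(lang, #child:trees())
end
```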
```lua
-- Parser for current buffer