The Arcology Garden

Local parsing of KOReader Notes to Org Roam

Life, Tech, Emacs, Arcology

Run this code: shell:pushd ~/Code/koreader-to-org && nix-shell shell.nix --run "fennel command.fnl" &

Locally-synced, transparent files can encourage diverse user-agents

This document serves as Literate Programming for a script to extract my notes from my roam:KOReader sync directory and output them as org-mode fn1. These notes are then accessible and linkable from my org-roam Knowledge Base and can be directly integrated into my thinking, or indirectly through the creation of Topic Files for each book with SRS cards in them.

I've been using the highlights json export plugin in KOReader to get KOReader Notes extracted adjacent to my Archive . It's fine enough but a bit rickety, since on Android the notes exporter fucking crashes without a backtrace. But because I have my books directory managed by Syncthing instead of a roam:Calibre or OPDS distribution, I can kludge around it by running KOReader in a container on My NixOS configuration and running the export plugin there, or by abusing bind-mounts on my Linux system to trick the Android app. roam:This is Fine , but it's oh so rickety and kludgey and basically doesn't work well enough.

But Syncthing , oh friendly Syncthing! Having files all in sync introduces the possibility of multiple user-agents. I don't even need to use KOReader to parse these!

I can start explaining by executing something like:

shell source: :results drawer :exports both
find ~/ebooks/books -wholename "*sdr/metadata.epub.lua" -type f -print0 | head -z -n 1 | xargs -0 head -n1
-- we can read Lua syntax here!

"we can read Lua syntax here!" 😎

"And so can I" 😈

Let's write some Fennel . Fennel can read roam:Lua syntax ;-) Fennel and Lua are of course in nixpkgs ; I'll make a roam:Nix Shell to set up an environment.

Nix Shell

The Nix shell imports hashings, which uses nums, and creates a little environment with penlight, lua, hashings (defined below), and find.

nix source: :tangle ~/Code/koreader-to-org/shell.nix :mkdirp yes
{ pkgs ? import <nixpkgs> {}, ...}:
let
  lua = pkgs.lua5_3;
  myLuaPkgs = import ./pkgs.nix { inherit pkgs; inherit lua; };
  myHashings = myLuaPkgs.hashings;
  myLua = lua.withPackages (luapkgs: with luapkgs; [penlight http myHashings rapidjson]);
in pkgs.mkShell {
  packages = [
    myLua.pkgs.fennel
    myLua
    pkgs.findutils
    # inotify-tools
  ];
}

Firing this up

It's easy enough to get a REPL running with that nix-shell from this document, or from there with shell:nix-shell ~/Code/koreader-to-org/shell.nix. This relies on a patch to fennel-mode for now...

emacs-lisp source: 
(setq fennel-program "nix-shell /home/rrix/Code/koreader-to-org/shell.nix --run \"fennel --repl\"")
(fennel-repl nil)

But with a REPL we can get a little bit weird. Without a REPL, I can run fennel command.fnl to execute all the code in this document.

Libraries and Locals

This makes use of Penlight, a toolbox of Lua functions and patterns. It's included in the Lua distribution created by the Nix Shell above. It also uses lua-http to query Wallabag .

fennel source: :tangle ~/Code/koreader-to-org/command.fnl
(local file (require :pl.file))
(local lapp (require :pl.lapp))
(local path (require :pl.path))
(local pretty (require :pl.pretty))
(local stringx (require :pl.stringx))
(local tablex (require :pl.tablex))
(local text (require :pl.text))
; (local wallabag (require :wallabag))
; (local api-prefix wallabag.api-prefix)
(local sha256 (require :hashings.sha256))

Including unpackaged dependency for sha256 sum in Nix Shell

I want to include two packages which aren't in nixpkgs ... I should fix that, but for now here they are:

nix source: :tangle ~/Code/koreader-to-org/pkgs.nix :mkdirp yes
{ pkgs, lua }: rec {
  nums = lua.pkgs.buildLuarocksPackage {
    pname = "nums";
    version = "20130228-2";
    knownRockspec = (pkgs.fetchurl {
      url = "https://luarocks.org/manifests/user-none/lua-nums-scm-1.rockspec";
      sha256 = "sha256-fxfcfiAgGGRhyCQZYYdUPs/WplMWVZH4QEPRlSW53uE=";
    }).outPath;
    src = pkgs.fetchFromGitHub {
      repo = "lua-nums";
      owner = "user-none";
      rev = "fef161a940aaafdbb8d9c75fe073b8bb43152474";
      sha256 = "sha256-coI8JHMx+6sikSndfbUIuo1jutHUnM3licI2s7I7fmQ=";
    };
    disabled = with lua; (lua.pkgs.luaOlder "5.3") || (lua.pkgs.luaAtLeast "5.5");
    meta = {
      homepage = "https://github.com/user-none/lua-nums";
      description = "Pure Lua number library providing BigNum and fixed width unsigned integer types";
      license.fullName = "MIT";
    };
  };

  hashings = lua.pkgs.buildLuarocksPackage {
    pname = "hashings";
    version = "20130228-2";
    knownRockspec = (pkgs.fetchurl {
      url = "https://luarocks.org/manifests/user-none/lua-hashings-scm-1.rockspec";
      sha256 = "sha256-SGx6kYhigTCmJQr/lFW6TARpM3na18M8lzgIDcOiCg0=";
    }).outPath;
    src = pkgs.fetchFromGitHub {
      repo = "lua-hashings";
      owner = "user-none";
      rev = "89879fe79b6f3dc495c607494126ec9c3912b8e9";
      sha256 = "sha256-/YagiUKAQKtHicsNE4amkHOJZvBEpDMs0qVjszkYnw4=";
    };
    disabled = with lua; (lua.pkgs.luaOlder "5.3") || (lua.pkgs.luaAtLeast "5.5");
    propagatedBuildInputs = [ lua nums ];
    meta = {
      homepage = "https://github.com/user-none/lua-hashings";
      description = "Pure Lua cryptographic hash library";
      license.fullName = "MIT";
    };
  };
}

Collecting the Files

I didn't find an easy way to glob-match files like find can do (though pl.dir should get us close enough eventually; I'm lazy and hacking for now) so we'll cheat and use find 😜 with that command from earlier. This function collect-highlights gets passed a function converter-fn which is called for each metadata.epub.lua (or whatever match-name is set to), with both the loaded table and the file name. The converter-fn used for the EPUBs' highlights may be different than for PDFs, maybe...

posix-egrep is used so that I can match things like (mobi|epub) ...

fennel source (collect-highlights): :tangle ~/Code/koreader-to-org/command.fnl
(fn collect-highlights [base-path match-name converter-fn]
  (let [proc (io.popen (.. "bash -c 'find " base-path
                           " -regextype posix-egrep -regex \"" match-name "\" -type f'"))]
    (local output [])
    (each [line (proc:lines)]
      (table.insert output (converter-fn ((loadfile line)) line)))
    output))
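The find invocation that collect-highlights shells out to can be tried standalone; a throwaway sketch against a fabricated sidecar tree (demo paths, not my real library):

```shell
# fabricate a KOReader-style sidecar tree
mkdir -p demo/book1.sdr demo/book2.sdr
touch demo/book1.sdr/metadata.epub.lua demo/book2.sdr/metadata.pdf.lua

# posix-egrep lets one pattern cover both extensions
find demo -regextype posix-egrep -regex '.*sdr/metadata\.(epub|pdf)\.lua' -type f | sort
# → demo/book1.sdr/metadata.epub.lua
# → demo/book2.sdr/metadata.pdf.lua
```

Each matched path would then be handed to converter-fn along with the table loadfile returns for it.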

The stub data is returned in a structure with some simple metadata at (datastructures)... There are probably some other interesting doc properties I could add, should add, will add. Filename, too?

fennel source: :tangle ~/Code/koreader-to-org/command.fnl
(fn init-datastructure [md-path from-md]
  (let [props (?. from-md :doc_props)
        stats (or (?. from-md :stats) {})]
    {"path" md-path
     "authors" (.. (?. props :authors))
     "title" (?. props :title)
     "series" (?. props :series)
     "md5" (or (?. stats :md5) (. from-md :partial_md5_checksum))
     "highlights" []}))

The entries are sorted by their date time (for EPUBs) or by page number (for PDFs) within the chapter, but the chapters are lexically-sorted within the parent datastructure, so it'll all have to be sorted again when rendering... hmm. I'd like to sort EPUBs based on the XPath in the locs metadata, but lexically sorting those without parsing them is infeasible since [10] belongs after [2] numerically but sorts before it lexically.
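That pitfall is easy to demonstrate with a plain byte-wise sort (a throwaway shell sketch, not part of the tangled script):

```shell
# '1' < '2' byte-wise, so p[10] lands before p[2] even though it belongs after it
printf '%s\n' 'p[2]' 'p[10]' | sort
# → p[10]
# → p[2]
```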

fennel source: :tangle ~/Code/koreader-to-org/command.fnl
(fn sort-by-datetime [first second]
  (< (. first "datetime") (. second "datetime")))

(fn sort-by-page-no [first second]
  (< (. first "page") (. second "page")))

NEXT swap collect-highlights to use pl.dir

Parsing Koreader metadata in to a kludged together tree structure

The metadata are basically the same, with some minor structure differences in the highlighter.

These could probably be a lot "prettier" if they used destructuring-based data-structure construction, but that can come in a future refactor. For now the goal is to move the data into a [book -> chapter -> list of highlights] topology mirroring the hierarchy of the org-mode document I want to dump.

process-one-book receives the metadata as directly loaded from the metadata.epub.lua file, and the file name it was loaded from. What we want to get out of this is an object restructured from the metadata's bookmarks key, grouped by the chapter they are indexed by and ordered by the earliest highlight in that chapter. This is not so difficult, just finicky. Don't forget to return the output structure we are populating!

fennel source: :tangle ~/Code/koreader-to-org/command.fnl :noweb yes
(fn find-chapter-by-name [highlights name]
  (tablex.find_if highlights (lambda [v] (= (. v :name) name))))

(fn process-one-book [metadata file-name]
  (let [bookmarks (or (. metadata "bookmarks") {})]
    (local output (init-datastructure file-name metadata))
    ;; (print "trying..." file-name)
    (print "processing..." (?. (?. metadata "doc_props") "title"))
    (if (?. (?. metadata "doc_props") "title")
        (do
          (var sum (accumulate [total 0
                                _i1 inner-tbl (tablex.sortv bookmarks sort-by-datetime)]
                     (do
                       (let [chapter (or (?. inner-tbl "chapter") "")
                             chapter-idx-maybe (or (find-chapter-by-name output.highlights chapter)
                                                   (do (table.insert output.highlights {:name chapter})
                                                       (length output.highlights)))]
                         (fn mk-highlight [chapter highlight-tbl]
                           {"datetime" (?. highlight-tbl "datetime")
                            "chapter" chapter
                            "locs" [(?. highlight-tbl "pos0") (?. highlight-tbl "pos1")]
                            "text" (or (?. highlight-tbl "text") (?. highlight-tbl "notes"))})
                         (table.insert (. output.highlights chapter-idx-maybe)
                                       (mk-highlight chapter inner-tbl)))
                       (+ total 1))))
          (print "parsed" sum "highlights")
          output)
        ;; XXX I sure hope this works lol!
        (do (print "Skipping stub metadata")
            {:path file-name}))))

find-chapter-by-name takes the sequential table of highlights and returns one with a matching :name property using tablex. It's embedded in the process-one-book function instead of being hoisted up mostly because I am lazy and don't want to hoist all the requirements to global scope.

fennel source (find-chapter-by-name): :noweb-ref find-chapter-by-name
(fn find-chapter-by-name [highlights name]
  (tablex.find_if highlights (lambda [v] (= (. v :name) name))))

The meat of the process is a loop over the sorted bookmarks. In file:::table-sort-fns above there is a function which will blindly string-sort one of the bookmark entities based on the datetime property. Replacing this with a real datetime sort or even by the XPath of the note is doable, but these datetimes already lexically sort well enough. And so in (iterate-bookmarks), they are sorted lexically by datetime and then an "inner table" is processed. A chapter object is instantiated in file:::get-chapter-idx and each highlight is inserted into that chapter in file:::mk-highlight

fennel source: :noweb-ref process-bookmark-table :noweb yes
(let [chapter (or (?. inner-tbl "chapter") "")
      chapter-idx-maybe <<get-chapter-idx>>]
  <<mk-highlight>>
  (table.insert (. output.highlights chapter-idx-maybe)
                (mk-highlight chapter inner-tbl)))

The chapter is either found or inserted anew. Note that instantiating this with a :name key makes this "technically" not a sequential table; care will have to be taken when iterating over it. Probably makes sense to stuff the name into a metatable eventually.

fennel source: :noweb-ref get-chapter-idx
(or (find-chapter-by-name output.highlights chapter)
    (do (table.insert output.highlights {:name chapter})
        (length output.highlights)))

And each highlight is constructed from the metadata table like so; the locations for PDF and EPUB are subtly different since one has HTML XPaths and one has ... well, rectangles. That will be an issue when we render them, but for now these can be "the same"-ish... Maybe just include a page-reference in the final render? These highlights get fed, ultimately, to file:::render-one-highlight

fennel source: :noweb-ref mk-highlight
(fn mk-highlight [chapter highlight-tbl]
  {"datetime" (?. highlight-tbl "datetime")
   "chapter" chapter
   "locs" [(?. highlight-tbl "pos0") (?. highlight-tbl "pos1")]
   "text" (or (?. highlight-tbl "text") (?. highlight-tbl "notes"))})

A sample chapter will look like this according to fennel's prettyprinter. PDFs will look similar but the location elements will have more information:

fennel source: 
{1 {:chapter "Chapter 11"
    :datetime "2021-05-20 11:36:06"
    :locs ["/body/DocFragment[154]/body/p[154]/text()[1].0"
           "/body/DocFragment[154]/body/p[154]/text()[4].107"]
    :text "The spy could hardly tightbeam the Uriel with the dangerous news that the crew of Father Captain de Soya’s Raphael had been going to confession too frequently, but that was precisely one of the causes of Liebler’s concern"}
 2 {:chapter "Chapter 11"
    :datetime "2021-05-20 13:04:57"
    :locs ["/body/DocFragment[154]/body/p[158]/text()[1].234"
           "/body/DocFragment[154]/body/p[158]/text()[1].389"]
    :text "The crew did not like Hoag Liebler—he was used to being disliked by classmates and shipmates, it was the curse of his natural-born aristocracy, he knew—but"}
 :name "Chapter 11"}

Rendering my Kludged Datastructure to Org

Each book comes out of process-one-book looking kind of like this:

fennel source: 
{:authors "Nikole Hannah-Jones The New York Times Magazine Nikole Hannah-Jones The New York Times Magazine"
 :highlights
 {"Preface: Origins by Nikole Hannah-Jones"
  [{:chapter "Preface: Origins by Nikole Hannah-Jones"
    :datetime "2022-01-17 00:02:03"
    :locs ["/body/DocFragment[8]/body/div/p[12]/text()[1].284"
           "/body/DocFragment[8]/body/div/p[12]/text()[3].6"]
    :text "I was starting to figure out that the histories we learn in school or, more casually, through popular culture, monuments, and political speeches rarely teach us the facts but only certain facts"}
   {:chapter "Preface: Origins by Nikole Hannah-Jones"
    :datetime "2022-01-17 00:03:57"
    :locs ["/body/DocFragment[8]/body/div/p[15]/text()[1].0"
           "/body/DocFragment[8]/body/div/p[15]/a/text().1"]
    :text "School curricula generally treat slavery as an aberration in a free society, and textbooks largely ignore the way that many prominent men, women, industries, and institutions profited from and protected slavery.6"}
   {:chapter "Preface: Origins by Nikole Hannah-Jones"
    :datetime "2022-01-17 00:04:46"
    :locs ["/body/DocFragment[8]/body/div/p[16]/text()[3].1"
           "/body/DocFragment[8]/body/div/p[16]/a[2]/text().1"]
    :text "Even educators struggle with basic facts of history, the SPLC report found: only about half of U.S. teachers understand that enslavers dominated the presidency in the decades after the founding and would dominate the U.S. Supreme Court and the U.S. Senate until the Civil War.8"}]}
 :title "The 1619 Project: A New Origin Story"}

It's the job of the rest of the doc to take the "tree" datastructure present in :highlights along with the other metadata stored at the book level and write those to disk as org-mode .

Aside: using Dead Simple Wallabag Fennel client to derive the URLs of read-it-later files

So with that tiny little dogshit API client, I can, uhh, capture a wallabag access token, and make a single API request with it to extract links to documents sent from my browser or phone to KOReader via its built in plugin. KOReader downloads the stories with a known file name [w-id_XXXX] where the XXXX is the ID of the story, so we fetch that and cram it into the datastructure here.

maybe-update-book-md-with-wallabag is called below in =render-one-book= .

fennel source: :tangle ~/Code/koreader-to-org/command.fnl :noweb yes
;; (local wallabag-token (->> ".wallabag"
;;                            (wallabag.load-client-credentials)
;;                            (wallabag.get-token (.. api-prefix "/oauth/v2/token"))))

;; (fn get-single-entry-from-wallabag [id]
;;   (let [(headers body) (wallabag.api-req wallabag-token "GET"
;;                                          (.. api-prefix "/api/entries/" id ".json"))]
;;     body))

;; (fn get-wallabag-url [id]
;;   (. (get-single-entry-from-wallabag id) :url))

Templating the org documents

I use penlight's pl.text.Template to render the tables that come out of process-one-book to strings, and smash them all together with the stringx module. These should be more-or-less self-evident based on the structure spat out by the collectors. Level 0 is the book, Level 1 is the chapter, Level 2 is the highlight within the chapter.

fennel source: :tangle ~/Code/koreader-to-org/command.fnl :noweb yes
(local template (. text :Template))

<<render-one-highlight>>

<<render-one-chapter>>

<<render-one-book>>

We use noweb syntax to make sure the functions are defined in the correct order.

render-one-book

Each book is its own org-roam document.

Books get their canonical ID from the md5 sum stored in the file.

fennel source: :noweb-ref render-one-book
(local book-tmpl (template ":PROPERTIES:
:ID: koreader-${md5}
:ROAM_REFS: \"${path}\"
:END:
#+TITLE: Notes from ${title}
#+AUTHORS: ${authors}

[[${path}][${path}]]
"))

(fn munge-book-path [book-md]
  (let [path (. book-md :path)
        (_ _ bag-id) (string.find path "%[w%-id_(%d+)%]")]
    (set book-md.path (.. "file:" (string.gsub path "sdr/metadata.([^.]+).lua" "%1")))
    ;; (if bag-id
    ;;     (do
    ;;       (print "bag id" bag-id)
    ;;       (set book-md.path (get-wallabag-url bag-id)))
    ;;     ;; sickos.jpg
    ;;     (set book-md.path (.. "file:"
    ;;                           (string.gsub path "sdr/metadata.([^.]+).lua" "%1"))))
    ))

(fn render-one-book [book]
  (let [authors (?. book "authors")
        title (?. book "title")]
    (munge-book-path book)
    (.. (: book-tmpl :substitute book)
        (stringx.join "\n"
                      (icollect [_i1 chapter-hls (pairs (?. book "highlights"))]
                        (render-one-chapter chapter-hls))))))

render-one-chapter

A book contains multiple chapters; this is a level-1 heading basically just used for organization. We generate the Chapter template and cram a sequential table into it of each highlight's text. There is a surprise here: we copy the chapter table so that the :name added in (get-chapter-idx) in the collectors is removed before collecting each highlight.

fennel source: :noweb-ref render-one-chapter
(local chapter-tmpl (template "* ${chapter}
"))

(fn render-one-chapter [chapter]
  (let [name (?. chapter :name)
        copy (tablex.deepcopy chapter)]
    (tset copy :name nil)
    (stringx.join "\n"
                  (tablex.insertvalues
                   [(: chapter-tmpl :substitute {:chapter name})]
                   (icollect [_i2 hl (ipairs copy)]
                     (values (render-one-highlight hl)))))))

render-one-highlight

This uses the hashings library which I import at the top to generate a unique ID for each highlight, based on the datetime at which I took the note and the note itself. That should surely be high enough entropy and stable. I sure hope so!

The text goes through some gsub functions so that special characters are escaped. Other transformations of the notes could happen here.

fennel source (render-one-highlight): :noweb-ref render-one-highlight
(local highlight-tmpl (template "** ${text}
:PROPERTIES:
:ID: ${id}
:LOC0: ${loc0}
:LOC1: ${loc1}
:PAGE: ${page}
:END:

[${datetime}]
"))

(fn render-one-highlight [hl]
  (let [locs (. hl :locs)
        digest (: sha256 :new (.. (. hl :text) (. hl :datetime)))
        hexdigest (: digest :hexdigest)
        fields (tablex.update
                {:datetime (. hl :datetime)
                 :text (-> (. hl :text)
                           (: :gsub "%$" "$ ")
                           (: :gsub "\n" "¶ "))
                 :id hexdigest}
                (if (= (type (. locs 1)) "table")
                    {:page (or (?. hl :page) (?. (. locs 1) :page) "")
                     :loc0 ""
                     :loc1 ""}
                    {:page ""
                     :loc0 (or (. locs 1) "")
                     :loc1 (or (. locs 2) "")}))]
    (: highlight-tmpl :substitute fields)))
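The ID recipe can be spot-checked from the shell, with coreutils' sha256sum standing in for the hashings library; the text and datetime below are made up for the sketch:

```shell
# sha256 over (text .. datetime); the hexdigest becomes the highlight's :ID: property
printf '%s%s' "some highlighted text" "2021-05-20 11:36:06" \
  | sha256sum | cut -d ' ' -f 1
```

The same inputs always give the same 64-hex-character digest, which is what makes the IDs stable across re-renders.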

Rendering the templates to a file

Get one book by invoking process-one-book and then write it to file. The entire "flow" is built around this function and the metadata collectors below.

  • Find all of the epubs (and PDFs) by wrapping =collect-highlights= with the metadata collectors below

  • Each metadata file is:

    • parsed into tables, and fed into =process-one-book= to get a normalized tree structure

    • Call =render-one-book= to create a string from the tree structure

    • check whether the metadata file is edited more recently than the notes and if so render it in maybe-write-to-file below.

fennel source: :tangle ~/Code/koreader-to-org/command.fnl :noweb yes
<<maybe-write-to-file>>

(fn write-one-book [book out-dir]
  (let [book-path (?. book :path)
        title (or (?. book :title) book-path)
        notes-path (path.join out-dir (.. (: title :gsub "[:/\\ ?]" "_") ".org"))
        high-cnt (accumulate [sum 0
                              i chap (ipairs (or (?. book :highlights) {}))]
                   (+ sum (length chap)))]
    (print "Maybe rendering" book-path)
    (print "to" notes-path)
    (when (> high-cnt 0)
      (maybe-write-to-file book-path notes-path
                           (lambda []
                             (print "rendering" high-cnt "highlights to string")
                             (render-one-book book))))))

(fn write-one-book-from-md [md filename out-dir]
  (let [book (process-one-book md filename)]
    (write-one-book book out-dir)
    book))

Helper to Write books to Files

This is a helper function which accepts a source file for modification detection at (mod-test) -- if either there is no notes file, or it's older than the book metadata, it will call render-fn to get a string of the book notes, and then write that to dest-file. Passing in a lambda allows this to be a "lazy" helper; we only want to render the note template if the file has been modified.

fennel source: :noweb-ref maybe-write-to-file
(fn maybe-write-to-file [src-file dest-file render-fn]
  (let [dest-file2 (path.expanduser dest-file)
        book-mtime (file.modified_time src-file)
        notes-mtime (file.modified_time dest-file2)]
    (print "Targeting..." dest-file2)
    (if (or (not notes-mtime) ;; (ref:mod-test)
            (< notes-mtime book-mtime))
        (match (io.output dest-file2)
          (nil msg) (print "Could not write file... " msg)
          f (let [text (render-fn)]
              (io.write text)
              (io.close f)
              (print "Rendered..." (length text))))
        (print "Skipping ... " dest-file2))))
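The modification check reduces to the same thing test's -nt operator gives you in the shell; a throwaway sketch with made-up file names:

```shell
cd "$(mktemp -d)"
touch metadata.epub.lua    # the book sidecar exists
# render only when the notes file is missing or older than the metadata
if [ ! -e notes.org ] || [ metadata.epub.lua -nt notes.org ]; then
  echo render
else
  echo skip
fi
# → render
```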
DONE re-structure this document so that the books are parsed as they're rendered

Rather than having all of them parsed in Invoking the collectors , then rendered here, delay the parsing.

NEXT bubble modtime check further up in to (write-one-book-from-md)

This would be so that the parsing of the notes does not even happen if the KOReader source has not been modified.

Koreader EPUB Metadata Collector

process-one-book is wrapped with collect-highlights into a simple interface which can be used to collect all the epubs' highlights into one big ol' table.

fennel source: :tangle ~/Code/koreader-to-org/command.fnl
(fn collect-epub-highlights [books-path out-path]
  (collect-highlights books-path ".*sdr/metadata.epub.lua"
                      (lambda [book book-path]
                        (write-one-book-from-md book book-path out-path))))

Koreader PDF Metadata Collector

The PDF notes are a bit differently shaped than the epubs', but basically close enough... They're sorted by page number. Just gotta grab the correct files!

fennel source: :tangle ~/Code/koreader-to-org/command.fnl
(fn collect-pdf-highlights [books-path out-path]
  (collect-highlights books-path ".*sdr/metadata.pdf.lua"
                      (lambda [book book-path]
                        (write-one-book-from-md book book-path out-path))))

Invoking the collectors

So each of those collectors will go all the way down to the file-system. This is basically the "entrypoint" of the app:

fennel source: :tangle ~/Code/koreader-to-org/command.fnl
(let [default-book-dir "/storage/emulated/0/reMarkable/"
      default-note-dir "~/org/highlights/"
      args (lapp (stringx.join "\n"
                               ["Parse koreader metadata files in to org-mode notes"
                                ""
                                "-f,--file (optional string) only parse one metadata.*.lua file"
                                (.. "-e,--epubs (optional string) parse epubs from here, otherwise " default-book-dir)
                                (.. "-p,--pdfs (optional string) parse pdfs from here, otherwise " default-book-dir)
                                (.. "-n,--notes (optional string) write outputs to directory: " default-note-dir)
                                ""]))
      file-name (?. args :file)
      file? (not (not file-name))
      epub-dir (or (. args :epubs) default-book-dir)
      pdf-dir (or (. args :pdfs) default-book-dir)
      notes-dir (or (. args :notes) default-note-dir)]
  (if file?
      (write-one-book-from-md ((loadfile file-name)) file-name notes-dir)
      (do
        (collect-epub-highlights epub-dir notes-dir)
        (collect-pdf-highlights pdf-dir notes-dir))))

NEXT many entries don't have correct metadata...

NEXT generate a TOC

Note Index

emacs-lisp source: :results drawer
(org-roam-db-sync)
(->> (org-roam-db-query [:select [id title]
                         :from nodes
                         :where (like file "%/org/highlights/%")
                         :and (= level 0)])
     (-map (pcase-lambda (`(,id ,title))
             (format "- [[id:%s][%s]]" id title)))
     (s-join "\n"))
- [[id:koreader-51c82ccd4509b76775966645948ef6f2][Notes from On Being Indispensable at Work]]
- [[id:koreader-faa9d9e526d3d5fea63ef0f0fbf40532][Notes from ERROR: Error reading EPUB format]]
- [[id:koreader-a046600deb309070697b799187602ca0][Notes from Get to the Point!]]
- [[id:koreader-37b4e2365ea0deed23308e550a1109f5][Notes from The Path of Love]]
- [[id:koreader-2b4c5c6da9b85e00a9c19757af2ca3dc][Notes from Where Cooking Begins: Uncomplicated Recipes to Make You a Great Cook]]
- [[id:koreader-7d00052d05dcdbee2f946e689a3de092][Notes from The Food Lab: Better Home Cooking Through Science]]
- [[id:koreader-317dd3db02269b00e9a592bbfb111970][Notes from Borne]]
- [[id:koreader-477e429de78c1e19d44f4002aa0b97de][Notes from Tales from Earthsea]]
- [[id:koreader-4e665514e08156bb68f450b76b538567][Notes from The 1619 Project: A New Origin Story]]
- [[id:koreader-8d00e8dcb7021c509fc31aa0a6acf0ae][Notes from Debt: The First Five Thousand Years]]
- [[id:koreader-20e6c3becc335b2a44526fae064ab389][Notes from The Codeless Code]]
- [[id:koreader-1f7b09b321466a2105aeda80c8786451][Notes from The JWT Handbook]]
- [[id:koreader-590597cb9fef8368a582edb5b4b212fd][Notes from 'Shared Data,' a short story from an alternate future]]
- [[id:koreader-a845e7db1da3f1adffa0cf07c1087c78][Notes from The Artist's Way]]
- [[id:koreader-3ddbe5750023e72ae83f87e0f690dae9][Notes from A Prayer for the Crown-Shy: A Monk and Robot Book]]
- [[id:koreader-2180bbf59e7a3e7117e724ef18527d4b][Notes from Stories of Your Life and Others]]
- [[id:koreader-18c70076fa09f7fee86e86b142051124][Notes from The Platform Sutra: The Zen Teaching of Hui-Neng]]
- [[id:koreader-f93a3d1de173da4ab3b100c11570e9cc][Notes from The Politics of Bitcoin: Software as Right-Wing Extremism]]
- [[id:koreader-9b039130b2387b5a00bd046da1109f99][Notes from The Penguin Book of Japanese Short Stories]]
- [[id:koreader-b04285c9a577c72a9906789265d257ed][Notes from Zen Mind, Beginner's Mind]]
- [[id:koreader-e420d1ca83e781d2631da5d140141e79][Notes from Starting FORTH: An Introduction to the FORTH Language and Operating System for Beginners and Professionals]]
- [[id:koreader-e1e155a8580329582c020d97fe8f1294][Notes from The Garden of Flowers and Weeds]]
- [[id:koreader-d70ac0109aec1b0eeefb7fc2b16c0ad3][Notes from Worlds Enough & Time]]
- [[id:koreader-b944d2d39dc3e0605a14e0fc4040ce2b][Notes from Why We Need to Study Nothing]]
- [[id:koreader-eb30eb4b117969036f38f965adc84b6d][Notes from Altered Traits]]
- [[id:koreader-fe23fdaf1fbf23d2149eb4d61c6334e6][Notes from Get In The Robot!_v1.0]]
- [[id:koreader-28077f64b973c74a2953c4ec903c3506][Notes from Can Nuclear Power Go Local?]]
- [[id:koreader-f1c5d9cb57ff885737f09606b556a425][Notes from Pedal Stretch Breath]]
- [[id:koreader-b37b1b0529ef5cf1fde6cda6ebb49fb3][Notes from Neoliberalism]]
- [[id:koreader-0b771a2c4b161617ba1f0c5820c77670][Notes from End of an era of ad targeting]]
- [[id:koreader-d1a6491ef20bd59663cf5f555dd337cc][Notes from Practical Zen: Meditation and Beyond]]
- [[id:koreader-f219176f5d90379b0e63d883cfd5e280][Notes from The Book of Joy: Lasting Happiness in a Changing World]]
- [[id:koreader-eab9b5b4161b04a2c93bf2c83e84ee0f][Notes from A General Overview of What Happens Before main() - Embedded Artistry]]
- [[id:koreader-44b66de71ce64752b5bc11dd2018a4bd][Notes from The Market Gardener: A Successful Grower's Handbook for Small-Scale Organic Farming]]
- [[id:koreader-68d1674c2b33d7e46bce904a8dbb94ae][Notes from ground_itself_pages_full]]
- [[id:koreader-bcc4947392c140821161e42ece945b56][Notes from Nonviolent Communication: A Language of Life]]
- [[id:koreader-95703c5eb14dc8e03042d198229a734e][Notes from An Overview of Global Wayfinding Meditation]]
- [[id:koreader-d41d8cd98f00b204e9800998ecf8427e][Notes from [w-id_1715] The Future is Vast_ Longtermism’s perspective on humanity’s past_ present_ and future]]
- [[id:koreader-fc3d4137aa4e0c4274529f7f71be8248][Notes from Authority: A Novel]]
- [[id:koreader-19ce639fd80c76ba8ec6ef5d33751474][Notes from Gödel, Escher, Bach: An Eternal Golden Braid]]
- [[id:koreader-b854099e0261f728da824dca2f284233][Notes from Capitalism & Disability]]
- [[id:koreader-2f5f92793c615f710fc924b370a56d48][Notes from The Wok: Recipes and Techniques]]
- [[id:koreader-22a125034c8bf0a24981778a55b39ce7][Notes from The Utopia of Rules]]
- [[id:koreader-c2ae34edad63bbdc180f0245edf0e717][Notes from From Mindfulness to Heartfulness: Transforming Self and Society with Compassion]]
- [[id:koreader-ae9ae2ae1cceeaeb99f713213b31b4b3][Notes from Shortcut to female voice]]
- [[id:koreader-48b3af02c6fa0afef6ca812f62778531][Notes from Zen Baggage]]
- [[id:koreader-0ff7034098ef1ab6fe65a73c8100be6d][Notes from A History of Ancient and Early Medieval India: From the Stone Age to the 12th Century]]
- [[id:koreader-139652aed5835de6ba33868a37013b13][Notes from The Heart Sutra]]
- [[id:koreader-5adf337b5ce80a0f8607858eb0f38dc7][Notes from The Hyperion Cantos 4-Book Bundle]]
- [[id:koreader-760e5e86f2cae73e0fd492b98decfd03][Notes from lasers_and_feelings_rpg]]
- [[id:koreader-d7bd6a56f6d5476c0c3af4b69317584a][Notes from Philosophy: A Very Short Introduction]]
- [[id:koreader-5f2ec4c52a14dd3095d41f72c1cc61c5][Notes from Annihilation]]
- [[id:koreader-e189389e60c609c212202d30daa168d0][Notes from Wizard of Earthsea (9780544084377)]]
- [[id:koreader-c44debe11c5199dd1dc9edbf8f4bc0d6][Notes from The Old Tea Seller: Life and Zen Poetry in 18th Century Kyoto]]
- [[id:koreader-0b868c20d2ee3656aab08c1ae69ede24][Notes from Wisdom of Insecurity]]
- [[id:koreader-403b120e0eb8f849f4441ba2a6d7d7a7][Notes from Mastering the Core Teachings of the Buddha: An Unusually Hardcore Dharma Book - Revised and Expanded Edition]]
- [[id:koreader-0dbfc0b87d62781219557fe35491ca87][Notes from The Tombs of Atuan]]
- [[id:koreader-244676434e6540b167815c8831cf0070][Notes from A Psalm for the Wild-Built]]
- [[id:koreader-da3ea64aea391affe05aa48efb19abe6][Notes from Beans]]
- [[id:koreader-cda3d1abc4acc3ec430f34516f725a08][Notes from A Long Night in The Mech Bay v1_2]]
- [[id:koreader-e102d195d011906a852c958c9f7d8a4f][Notes from Nix Pills]]
- [[id:koreader-a504c02730110bd6ef25f7c02bd33625][Notes from On Inflation: It's the Monopoly Profits, Stupid]]
- [[id:koreader-cdb8efb1057b03f02a9c5fdc63c31b87][Notes from The Diamond Sutra: The Perfection of Wisdom]]
- [[id:koreader-c9604ca28d6fb22b80e0bf348e3e5e03][Notes from Mexican Gothic]]
- [[id:koreader-45d29125ecfd7affcf02727697461f56][Notes from How to Do Nothing: Resisting the Attention Economy]]
- [[id:koreader-d58b23a3daa758fd51df18ae045fa15d][Notes from If the News is Fake, Imagine History | The Network State]]
- [[id:koreader-573f7a8740b5d511199a5e013e535e4b][Notes from Acceptance]]
- [[id:koreader-393d84388e2c3b1010240163c6b5a7dc][Notes from Privacy at the Margins]]
- [[id:koreader-827bde8c3584afcf9118f2afad004ee2][Notes from How to Understand Your Gender]]
- [[id:koreader-34ba4f87765a4cb7bfb396d673b6f5b3][Notes from Big Ball of Mud]]
- [[id:koreader-056399435f05056282be954cc594c5b4][Notes from The OpenID Connect Handbook]]
- [[id:koreader-516abcbad6b8452a4ce510248e423579][Notes from ‘They are preparing for war’: An expert on civil wars discusses where political extremists are taking this country]]
- [[id:koreader-662db0b8dd381fcbc033367ee93145cd][Notes from Thinking Forth]]
- [[id:koreader-feefe3a2ce4e3c74f624e20b1c4e01b0][Notes from The Lankavatara Sutra: Translation and Commentary]]
- [[id:koreader-da0ea50ad6dcdaf04b3a465fc8afd6e3][Notes from Introduction to Zen Training]]
- [[id:koreader-aeeff52480c3289c754ac9190eaaf297][Notes from She Comes First]]
- [[id:koreader-f25f7ddd9739dd89eee7338ee6637efa][Notes from Tao te Ching]]
- [[id:koreader-3ce19228de82a86f35a019f7d9ef96f9][Notes from By Steppe, Desert, and Ocean: The Birth of Eurasia]]
- [[id:koreader-1e5dbb412ddb08195212b63965a41a1d][Notes from This Is How You Lose the Time War]]
- [[id:koreader-4b29ba84175a8582a6eccf71fc30ef9c][Notes from 45 days of silence - meditating 16 hours a day for 45 days]]
- [[id:koreader-994e5f35e65f736b7a96afdccb2a0f92][Notes from INTROSPECT]]
- [[id:koreader-35e8b8d8413e6e77b60c0cd7192c1d87][Notes from Nixonland]]
- [[id:koreader-de9bb8e989f4a1fa26542385f40adfdc][Notes from Friendly Ambitious Nerd v1.0]]
- [[id:koreader-e3216191554232b29aa0ff59478e3898][Notes from Beowulf: A New Translation]]
- [[id:koreader-6642267c40bb43e2dc103bca11d14f1e][Notes from Stepping Out of Self-Deception]]
- [[id:koreader-70dce22eed023a838512fda066513e47][Notes from Saving Time: Discovering a Life Beyond the Clock]]
- [[id:koreader-3fef347d29b8d5dfefb6c27ecaa2cf03][Notes from ECSx: A New Approach to Game Development in Elixir]]
- [[id:koreader-88736d244febe0e9f10d70830f12d70d][Notes from Buddhist Ethics: A Very Short Introduction]]
- [[id:koreader-716d45ec388a0946800df25594b9ab4d][Notes from Thinking Without a Banister]]

Footnotes

fn1

They're not included in my Archive though, that is they're unlinked to the broader "sphere of thinking" and from my auto-complete database -- they don't have IDs in the org-roam database, they aren't visible to Arroyo . I choose to work them in manually from the roam:Note Index page. This is one of My Living Systems which I use to roam:Remember Anything I Read .