Loading a few thousand URLs into memory, spitting them into an HTML template, and loading that page in a browser is incredibly inefficient; the load time is insane and the page is massive.
Each page should only show 100 URLs.
# Backend
- Create `read()` function that accepts `start`, `end`, and `count` arguments indicating what the last item is (implying forward iteration) or what the first item is (implying reverse iteration) and how many entries should be returned (see the sketch below)
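A minimal sketch of what `read()` could look like, assuming keyset-style pagination over a SQLite table named `urls` keyed by the URL itself; the table name, column, and queries are assumptions for illustration, not the actual schema.

```go
package storage

import "database/sql"

// read returns up to count URLs. A non-empty start is the last item on the
// current page, so iteration moves forward from it; a non-empty end is the
// first item on the current page, so iteration moves backward from it.
func read(db *sql.DB, start, end string, count int) ([]string, error) {
	var (
		rows *sql.Rows
		err  error
	)
	switch {
	case start != "":
		// Forward: everything after the last item on the current page.
		rows, err = db.Query(
			"SELECT url FROM urls WHERE url > ? ORDER BY url ASC LIMIT ?",
			start, count)
	case end != "":
		// Backward: everything before the first item on the current page,
		// selected in descending order and reversed before returning.
		rows, err = db.Query(
			"SELECT url FROM urls WHERE url < ? ORDER BY url DESC LIMIT ?",
			end, count)
	default:
		// Neither cursor given: first page.
		rows, err = db.Query(
			"SELECT url FROM urls ORDER BY url ASC LIMIT ?", count)
	}
	if err != nil {
		return nil, err
	}
	defer rows.Close()

	var urls []string
	for rows.Next() {
		var u string
		if err := rows.Scan(&u); err != nil {
			return nil, err
		}
		urls = append(urls, u)
	}
	if end != "" {
		// Restore ascending order for the backward page.
		for i, j := 0, len(urls)-1; i < j; i, j = i+1, j-1 {
			urls[i], urls[j] = urls[j], urls[i]
		}
	}
	return urls, rows.Err()
}
```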
# Web UI
- Add `pageStart`, `pageEnd`, and `pageItemCount` URL queries to `root()` for passing to `read()` as `start`, `end`, and `count` values (see the handler sketch after this list).
- Add templated forward/backward buttons that pass `pageStart`, `pageEnd`, and `pageItemCount` to the backend as URL queries.
# API
- Add URL queries to the `/read` path that pass `start`, `end`, and `count` values to `read()` and return a JSON blob of the requested data (see the sketch below).
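A sketch of the corresponding JSON handler, assuming it lives alongside the `root()` sketch above (plus an `encoding/json` import) and gets registered with `http.HandleFunc("/read", readHandler)`; the response shape and error body are assumptions.

```go
// Hypothetical /read handler returning the requested page of URLs as JSON.
func readHandler(w http.ResponseWriter, r *http.Request) {
	q := r.URL.Query()
	count, err := strconv.Atoi(q.Get("count"))
	if err != nil || count < 1 || count > 100 {
		count = 100 // default and maximum page size
	}
	urls, err := read(store, q.Get("start"), q.Get("end"), count)
	if err != nil {
		http.Error(w, `{"error":"failed to read URLs"}`, http.StatusInternalServerError)
		return
	}
	w.Header().Set("Content-Type", "application/json")
	// Echo the pagination cursors so clients can request the next/previous page.
	resp := map[string]any{
		"urls":  urls,
		"count": len(urls),
	}
	if len(urls) > 0 {
		resp["start"] = urls[len(urls)-1]
		resp["end"] = urls[0]
	}
	_ = json.NewEncoder(w).Encode(resp)
}
```

With that in place, something like `GET /read?count=100` would return the first 100 URLs, and `GET /read?start=<last URL shown>&count=100` the following 100.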
Amolith referenced this ticket in commit 622b36b.
Amolith referenced this ticket in commit 7564d54.
Amolith referenced this ticket in commit eb7e22b.
Amolith referenced this ticket in commit aa07589.
Amolith referenced this ticket in commit 226986a.
Amolith referenced this ticket in commit 8428c70.
Amolith referenced this ticket in commit 2a59e83.