Replies: 1 comment
Hmm... actually it did not take that long on the second try. Perhaps because I commented out the check for duplicates by sorting (necessary for traditional chording, not for this approach). I am all good!
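For context, a guess at what that duplicate check looks like (this is my own sketch, not ZipChord's actual code): in traditional chording the keys of a chord are pressed together, so two entries that are anagrams of each other collide, and sorting each entry's letters is the usual way to detect that. In the ordered, chunk-based scheme described below, the order matters, so this check can be skipped:

```python
def has_chord_duplicates(chords):
    """Detect collisions between order-insensitive chords by
    comparing each entry's sorted letters. A sketch of the check
    described above, not ZipChord's actual implementation."""
    seen = set()
    for chord in chords:
        key = ''.join(sorted(chord))  # 'oab' and 'abo' map to the same key
        if key in seen:
            return True
        seen.add(key)
    return False

# 'abo' and 'oab' are the same key combination in traditional chording,
# but distinct entries in an order-sensitive scheme, so the check (and
# its sorting cost over ~938,850 lines) becomes unnecessary there.
print(has_chord_duplicates(['abo', 'oab']))  # True
print(has_chord_duplicates(['abo', 'ut']))   # False
```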
tl;dr: My idea is to conditionally increase the performance of writing to a file (i.e., with `critical`???) for larger files.

I have a unique chording method I am testing. I have a program right now that generates 938,850 lines from approximately 1,000 words by chunking each word, permuting each chunk, and combining it with permutations of the following chunk.

So for `about` we have `['abo', 'ut']` as one combination, for example, and we permute each index into all possible permutations and combine each possible permutation with the next index's possible permutations (`['ut', 'tu']`) to create something more unique (e.g., `oabtu`). The idea is that one could type/smash words on the keyboard in chunks of the word rather than as a code or shorthand for the word, which would mean no memorizing of a chording system.

I discovered the `shorthands` code path works for this type of input! 🥳

Anyways, it's the 938,850 lines that leave any program I write stalling for minutes, and also ZipChord (or actually I do not know yet whether it completes in ZipChord). My idea is to conditionally increase the performance of writing to a file (i.e., with `critical`???) for larger files, or as an option. My current file is 23.51 MB.
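The chunk-and-permute generation described above can be sketched in Python as follows. The chunk size of 3 and the helper names are my assumptions, since the post does not show the actual program:

```python
from itertools import permutations, product

def chunk_word(word, size=3):
    """Split a word into fixed-size chunks; e.g. 'about' -> ['abo', 'ut'].
    (The chunk size is an assumption; the post does not specify it.)"""
    return [word[i:i + size] for i in range(0, len(word), size)]

def chunk_permutations(chunk):
    """All orderings of the letters in one chunk; 'ut' -> ['ut', 'tu']."""
    return [''.join(p) for p in permutations(chunk)]

def word_variants(word):
    """Combine every permutation of each chunk with every permutation of
    the following chunk; e.g. 'oab' + 'tu' -> 'oabtu'."""
    per_chunk = [chunk_permutations(c) for c in chunk_word(word)]
    return [''.join(combo) for combo in product(*per_chunk)]

variants = word_variants('about')
print('oabtu' in variants)  # True: the example variant from the post
print(len(variants))        # 12: 3! orderings of 'abo' x 2! orderings of 'ut'
```

On the write-performance side, one generic mitigation (independent of whatever `critical` does in ZipChord, which appears to be an AutoHotkey concern) is to build all lines in memory and write them in a single call, e.g. `f.write('\n'.join(all_lines))`, rather than writing ~938,850 lines one at a time.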