Most blogs state that the space complexity of a trie is

$$$$\mathcal{O}(N \cdot L \cdot R)$$$$

where:

- $$$N$$$ is the number of inserted strings,
- $$$L$$$ is the maximum string length,
- $$$R$$$ is the alphabet size.

This is usually justified by:

- each inserted string creates at most $$$L$$$ new nodes, so there are at most $$$N \cdot L$$$ nodes in total;
- each node stores an array of $$$R$$$ child pointers.

Hence:

$$$$\text{space} = \mathcal{O}(N \cdot L \cdot R).$$$$
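To see where the $$$R$$$ factor comes from, here is a minimal array-based trie sketch (the names `Trie`, `insert`, etc. are mine, not from any particular blog): every node eagerly allocates all $$$R = 26$$$ child slots.

```cpp
#include <array>
#include <string>
#include <vector>

// Minimal array-based trie over lowercase letters (R = 26).
// Every node stores all R child indices up front, so memory = (#nodes) * R.
struct Trie {
    static constexpr int R = 26;
    // nodes[v][c] = index of the child of v via letter c, or -1 if absent.
    std::vector<std::array<int, R>> nodes;

    Trie() { nodes.push_back(makeNode()); }  // node 0 is the root

    static std::array<int, R> makeNode() {
        std::array<int, R> a;
        a.fill(-1);
        return a;
    }

    void insert(const std::string& s) {
        int v = 0;
        for (char ch : s) {
            int c = ch - 'a';
            if (nodes[v][c] == -1) {              // a string creates at most L
                nodes[v][c] = (int)nodes.size();  // new nodes, hence <= N*L total
                nodes.push_back(makeNode());
            }
            v = nodes[v][c];
        }
    }
};
```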
However, this is a very loose bound. Let us analyze this carefully.
At depth $$$d$$$, the number of distinct nodes is at most

$$$$\min(R^d, N)$$$$

because:

- a node at depth $$$d$$$ corresponds to a distinct string of length $$$d$$$, and there are only $$$R^d$$$ such strings;
- every node at depth $$$d$$$ is a prefix of at least one of the $$$N$$$ inserted strings, so no level can hold more than $$$N$$$ nodes.
Assume:

- $$$R = 26$$$ (lowercase English letters),
- $$$N = 10^6$$$ inserted strings of maximum length $$$L$$$.

Compute nodes per level:

| Level $$$d$$$ | $$$R^d$$$ | $$$\min(R^d, N)$$$ |
|---|---|---|
| 1 | 26 | 26 |
| 2 | 676 | 676 |
| 3 | 17576 | 17576 |
| 4 | 456976 | 456976 |
| 5 | 11881376 | $$$10^6$$$ |
From level 5 onward, the number of nodes is capped by $$$N$$$.
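As a quick sanity check of these per-level caps (a throwaway computation, not part of the original derivation):

```cpp
#include <algorithm>
#include <cstdint>
#include <cstdio>

int main() {
    const int64_t R = 26, N = 1'000'000;
    int64_t p = 1;
    for (int d = 1; d <= 6; ++d) {
        p *= R;  // p = R^d
        std::printf("level %d: min(R^d, N) = %lld\n",
                    d, static_cast<long long>(std::min(p, N)));
    }
}
```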
Let us count nodes: the first four levels contribute at most $$$26 + 26^2 + 26^3 + 26^4$$$ nodes, and each of the remaining $$$L - 4$$$ levels contributes at most $$$N$$$ nodes.

Hence, total nodes:

$$$$\sum_{d=1}^{4} 26^d + (L-4) \cdot N = 475254 + (L-4) \cdot 10^6.$$$$

(We use $$$L-4$$$ because the first 4 levels are counted separately: 1, 2, 3, 4 → 4 levels, so the remaining $$$L-4$$$ levels are “full” levels capped by $$$N$$$.)
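For instance, with $$$L = 10$$$ (an illustrative value of mine, not one fixed by the analysis):

$$$$475254 + 6 \cdot 10^6 \approx 6.5 \cdot 10^6 \text{ nodes, versus } N \cdot L = 10^7 \text{ for the naive count}.$$$$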
Each node stores an array of size $$$R = 26$$$, so total memory:

$$$$26 \cdot \bigl(475254 + (L-4) \cdot 10^6\bigr) \text{ pointers}.$$$$

Naive bound:

$$$$N \cdot L \cdot R = 26 \cdot L \cdot 10^6 \text{ pointers}.$$$$
We compare the prefix contribution: the naive bound assigns $$$N$$$ nodes to each of the first four levels, i.e. $$$4N = 4 \cdot 10^6$$$ nodes, while the true count is $$$\sum_{d=1}^{4} 26^d$$$.

Compute sum:

$$$$26 + 676 + 17576 + 456976 = 475254.$$$$

For $$$N = 10^6$$$:

$$$$475254 \ll 4 \cdot 10^6, \qquad \frac{4 \cdot 10^6}{475254} \approx 8.4.$$$$

✅ True.
Hence, on these early levels the naive bound overestimates memory by about 8×, because it assumes $$$N$$$ nodes at each level, while in reality the early levels have far fewer nodes.
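If you want to double-check the arithmetic, a throwaway verification (mine, not the post’s):

```cpp
#include <cstdio>

int main() {
    long long sum = 0, p = 1;
    for (int d = 1; d <= 4; ++d) { p *= 26; sum += p; }  // 26 + 676 + 17576 + 456976
    std::printf("prefix sum = %lld\n", sum);             // prints 475254
    std::printf("ratio 4N / sum = %.2f\n", 4e6 / sum);   // prints ~8.42
}
```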
In general, the tight upper bound on the number of nodes is:

$$$$\sum_{d=1}^{L} \min(R^d, N).$$$$

So total space:

$$$$\mathcal{O}\left(R \cdot \sum_{d=1}^{L} \min(R^d, N)\right).$$$$

For fixed alphabet size $$$R$$$: the geometric part $$$\sum_{d \le \log_R N} R^d$$$ is $$$\mathcal{O}(N)$$$, so the space simplifies to $$$\mathcal{O}\bigl(N \cdot (L - \log_R N + 1)\bigr)$$$, which improves most over the naive $$$\mathcal{O}(N \cdot L)$$$ when $$$L$$$ is close to $$$\log_R N$$$.
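As a closing illustration, here is a small helper (my own code, not from the post) that evaluates the tight node bound $$$\sum_{d=1}^{L} \min(R^d, N)$$$ and compares it with the naive $$$N \cdot L$$$:

```cpp
#include <algorithm>
#include <cstdint>
#include <cstdio>

// Tight upper bound on trie nodes: sum over depths d = 1..L of min(R^d, N).
int64_t tightNodeBound(int64_t R, int64_t N, int64_t L) {
    int64_t total = 0, p = 1;
    for (int64_t d = 1; d <= L; ++d) {
        if (p <= N / R) p *= R;  // keep the exact power while R^d <= N
        else p = N + 1;          // once R^d exceeds N, the cap N takes over
        total += std::min(p, N);
    }
    return total;
}

int main() {
    const int64_t R = 26, N = 1'000'000, L = 10;  // L = 10 is an arbitrary example
    std::printf("tight bound: %lld nodes\n", (long long)tightNodeBound(R, N, L));
    std::printf("naive bound: %lld nodes\n", (long long)(N * L));
}
```

For $$$R = 26$$$, $$$N = 10^6$$$, $$$L = 10$$$ this prints $$$6475254$$$ nodes versus the naive $$$10^7$$$.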