Speed up Data.Unique
The current Data.Unique code seems heavier than necessary:

- It uses `Integer` when it can surely get away with less than two words on 64-bit systems.
- It effectively guarantees that uniques will be consecutive, which isn't very useful.
I don't know how to fix this, but I'm confident there's a better way out there.
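For a sense of what "less than two words" could look like, here is a rough sketch (not GHC's actual implementation) of an `Int`-backed replacement for the `Integer` counter, assuming a 64-bit `Int` won't wrap in any realistic run:

```haskell
import Data.IORef (IORef, newIORef, atomicModifyIORef')
import System.IO.Unsafe (unsafePerformIO)

-- A Unique backed by a single machine Int rather than an Integer.
newtype Unique = Unique Int deriving (Eq, Ord)

-- Global counter; NOINLINE keeps the CAF from being duplicated.
uniqSource :: IORef Int
uniqSource = unsafePerformIO (newIORef 0)
{-# NOINLINE uniqSource #-}

newUnique :: IO Unique
newUnique =
  atomicModifyIORef' uniqSource (\x -> let z = x + 1 in (z, Unique z))

hashUnique :: Unique -> Int
hashUnique (Unique i) = i
```

This keeps the same `Eq`/`Ord`/`hashUnique` surface while allocating no `Integer`s, at the cost of contention on the single `IORef` under parallelism.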
One silly idea: use one 100-or-so-bit counter per capability, and use the rest of the bits to distinguish among the capabilities. To reduce hash collisions, have each capability increment its counter by a different prime number (or something like that).
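A hypothetical sketch of the per-capability idea, using `threadCapability` from `Control.Concurrent` to select a counter. The prime strides and the bit layout (counter above ten capability bits, rather than a 100-bit counter, which doesn't fit in an `Int`) are made up for illustration:

```haskell
import Control.Concurrent (myThreadId, threadCapability, getNumCapabilities)
import Data.Array (Array, listArray, (!))
import Data.Bits (shiftL, (.|.))
import Data.IORef (IORef, newIORef, atomicModifyIORef')
import System.IO.Unsafe (unsafePerformIO)

newtype Unique = Unique Int deriving (Eq, Ord)

-- One counter per capability, created once at startup.
-- (A real version would have to cope with setNumCapabilities.)
counters :: Array Int (IORef Int)
counters = unsafePerformIO $ do
  n <- getNumCapabilities
  refs <- mapM (const (newIORef 0)) [1 .. n]
  pure (listArray (0, n - 1) refs)
{-# NOINLINE counters #-}

-- Illustrative per-capability strides: distinct odd primes so counters
-- on different capabilities walk different residue classes, which
-- should spread hash values around.
stride :: Int -> Int
stride cap = [3, 5, 7, 11, 13, 17, 19, 23] !! (cap `mod` 8)

newUnique :: IO Unique
newUnique = do
  (cap, _) <- threadCapability =<< myThreadId
  n <- atomicModifyIORef' (counters ! cap)
         (\x -> let z = x + stride cap in (z, z))
  -- Tag the low bits with the capability number so values drawn from
  -- different capabilities can never collide.
  pure (Unique ((n `shiftL` 10) .|. cap))
```

Uniqueness holds even if a thread migrates between capabilities mid-run: each value is determined by the `(capability, counter)` pair, and each per-capability counter only ever increases.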
Trac metadata
| Trac field | Value |
|---|---|
| Version | 8.4.3 |
| Type | Bug |
| TypeOfFailure | OtherFailure |
| Priority | normal |
| Resolution | Unresolved |
| Component | Core Libraries |
| Test case | |
| Differential revisions | |
| BlockedBy | |
| Related | |
| Blocking | |
| CC | |
| Operating system | |
| Architecture | |