They make up a third of the integer benchmarks in Geekbench. They are weighted at least three orders of magnitude higher than they should be in a performance benchmark.
Yeah, and another third is image compression. Can you explain why that's more representative? I'm not saying definitively that either is or isn't, but claiming they're over-weighted by three orders of magnitude (1000 times!) is pretty out there. Encryption and decryption are not that unusual. Maybe they're I/O-limited a lot of the time, but so are compression and decompression. Throwing out the encryption results entirely (except where influenced by acceleration instructions) and relying on the rest of the set to gauge overall performance doesn't seem like a step in a better direction.
When you get down to it, a benchmark isn't necessarily representative based on what specific task it accomplishes but on how similar the states it puts the CPU in are to those of other programs. In other words: the spread of memory locality of reference, memory-level parallelism, the mix of ALU operations and their ratio to loads/stores, the number of branches and branch mispredictions, the presence of floating-point code, the use of vectorization, and so on. But since this stuff rarely gets analyzed outside of SPEC, we're stuck with just judging the benchmarks by what they do.
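For what it's worth, here's a minimal sketch (my own toy code, nothing from Geekbench or SPEC) of why task semantics alone tell you little: two loops doing a comparable amount of integer work over the same array, one branch-free and ALU-bound, the other full of data-dependent branches. Run each under `perf stat -e instructions,branches,branch-misses` and they look like completely different workloads, even though both just "process an array."

```c
/* Toy illustration: same data, same nominal task, very different CPU states.
 * Build: cc -O1 cpu_states.c   (higher -O levels may if-convert the branchy
 * loop into branchless cmov code, which would hide the mispredictions)
 * Compare: perf stat -e instructions,branches,branch-misses ./a.out 0
 *     vs:  perf stat -e instructions,branches,branch-misses ./a.out 1 */
#include <stdio.h>
#include <stdlib.h>

#define N (1 << 24)

int main(int argc, char **argv) {
    int mode = argc > 1 ? atoi(argv[1]) : 0;
    unsigned *data = malloc(N * sizeof *data);
    unsigned sum = 0;
    srand(42);
    for (int i = 0; i < N; i++)
        data[i] = (unsigned)rand();

    if (mode == 0) {
        /* ALU-heavy, branch-free: predictable control flow, high IPC */
        for (int i = 0; i < N; i++)
            sum += (data[i] * 2654435761u) ^ (data[i] >> 13);
    } else {
        /* Data-dependent branches on random input: ~50% mispredict rate */
        for (int i = 0; i < N; i++)
            if (data[i] & 1)
                sum += data[i];
            else
                sum ^= data[i];
    }
    printf("%u\n", sum);
    free(data);
    return 0;
}
```

The point being: whether a suite calls a subtest "AES" or "JPEG" matters much less than which of these regimes it spends its cycles in, and that's exactly the analysis we almost never get to see.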