It wasn’t too long ago that having 256MB of memory or even 1GB of disk space was a big deal. At the time, I thought “Wow! 1GB! What am I going to do with all that space?” Well, as usual, software engineers can never get enough speed or memory. When mp3s and mp4s came along, together with higher bandwidth, suddenly 1GB didn’t seem like enough for all the movies and music. Applications themselves took on more complexity and layers of abstraction, and required faster hardware and bigger hard drives, not to mention the operating systems themselves.
As one terabyte looms ahead, the question is: what would you do with 1TB? What would you be able to do that you can’t do now?
I imagine that beyond movies and music, there will be other forms of high-bandwidth data that aren’t available yet. Olfactory and tactile data should be immense in bandwidth, though currently we don’t have input devices for those senses. If personality-based artificial intelligence becomes commonplace, that too might require a large amount of storage. Sensory information for bio implants, as well as simulations for augmented and virtual reality, will demand more and more storage.
I’m not sure how other computing architectures would fit into this, but quantum computers might generate a lot of data simply due to the sheer parallelism of the architecture.