"Texture" should remind us that time does not have to be modeled as a unidimensional path, though all our software tools for time-based media assume it is! It's not at all clear to me where that leads yet. Any thoughts to start?
I believe CNMAT's tools have avoided this trap for many years. In CAST and OSW we use various "time machines" to index signals. Some of these use an interesting stateless scheme I developed to bridge samplings of time (events) into continuous time (and back again). I frequently explore our interpolation spaces using stored or penned traces. Andy's OSC database work allows efficient (timely) access to OSC recordings in a "non-linear" framework. John's recent NIME work is another interesting take on modeling: it allows various polytempo textures to be created without completely letting go of control over convergence and divergence to and from events. I would be happy to share details of this toward our collective understanding of path(s) and textures.
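To make the idea concrete, here is a minimal sketch of what a stateless event-time/continuous-time bridge might look like. This is my own illustration, not the actual CNMAT/CAST scheme: it assumes the "time machine" is just an immutable list of (event index, seconds) pairs, so any query in either direction can be answered by interpolation with no mutable playback state.

```python
from bisect import bisect_right

def event_to_seconds(events, beat):
    """Map a (possibly fractional) event index to continuous time.
    `events` is a sorted list of (event_index, seconds) pairs."""
    i = bisect_right([b for b, _ in events], beat) - 1
    i = max(0, min(i, len(events) - 2))          # clamp to a valid segment
    (b0, t0), (b1, t1) = events[i], events[i + 1]
    return t0 + (beat - b0) * (t1 - t0) / (b1 - b0)

def seconds_to_event(events, t):
    """Inverse mapping: continuous time back to a fractional event index."""
    i = bisect_right([s for _, s in events], t) - 1
    i = max(0, min(i, len(events) - 2))
    (b0, t0), (b1, t1) = events[i], events[i + 1]
    return b0 + (t - t0) * (b1 - b0) / (t1 - t0)

# An accelerating tempo: successive events land closer together in time.
events = [(0, 0.0), (1, 1.0), (2, 1.8), (3, 2.4)]
print(event_to_seconds(events, 1.5))   # halfway between events 1 and 2, ~1.4 s
print(seconds_to_event(events, 1.4))   # and back again, ~event 1.5
```

Because both directions are pure functions of the stored trace, the same structure can index a recording forward, backward, or at any rate, which is one way to get "non-linear" access without a single master timeline.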