o4.track (video + sensor OSC data), gyro (Was: Synthesis Center / Inertial Sensor Fusion)

Great!

Mike, can you generate data in Julian’s data structure and store it in a shared directory for us all,
along with the journaled video? Julian Stein wrote the object for journaling data.

On Nov 10, 2014, at 1:07 AM, Julian Stein <julian.stein@gmail.com> wrote:
Also included in the O4.rhyth_abs is a folder labeled o4.track. This features a simple system for recording and playing video with a synchronized osc data stream.
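As a purely illustrative aside (the actual on-disk format is defined by Julian’s journaling object): the essence of such a system is that every OSC message is stored with the media time at which it arrived, so playback can re-emit the messages in sync with the video clock. A hypothetical JavaScript sketch:

    // Hypothetical sketch of journaling an OSC stream alongside video.
    // Each record keeps the video time so playback can re-synchronize.
    // Illustrative only: o4.track's actual format may differ.
    var journal = [];

    // Called for each OSC message that arrives while recording.
    function journalMessage(videoTimeMs, address, args) {
      journal.push({ t: videoTimeMs, addr: address, args: args });
    }

    // Replay: emit every message whose timestamp falls inside the
    // window [prevTimeMs, currTimeMs) as the video plays back.
    function replay(prevTimeMs, currTimeMs, emit) {
      for (var i = 0; i < journal.length; i++) {
        var rec = journal[i];
        if (rec.t >= prevTimeMs && rec.t < currTimeMs) {
          emit(rec.addr, rec.args);
        }
      }
    }

    // e.g. journalMessage(1042, "/gyro/1", [0.01, -0.23, 0.98]);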
I’ll cc this to the SC team so they can point out those utilities on our GitHub.

It’d be great if you could give the Signal Processing group some real, live data to analyze offline in MATLAB this week, as a warmup to Teoma and John’s data the week of Feb 15.

We must have video journaled as well, always.

I’d be interested in hearing an informal brownbag talk about Lyapunov exponents on one of the mornings in the week of Feb 15, together with some analysis of the data.
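As a concrete, back-of-envelope version of what that analysis could look like: the largest Lyapunov exponent of a scalar sensor stream (say, one gyro axis) can be estimated by delay-embedding the series, tracking how initially nearby trajectory segments diverge, and fitting the slope of the mean log divergence (Rosenstein’s method). A rough JavaScript sketch; dim, tau, and horizon are hypothetical parameter choices that would need tuning to the data:

    // Rough sketch of Rosenstein's method for the largest Lyapunov
    // exponent of a scalar time series x (e.g., one gyro axis).
    function largestLyapunov(x, dim, tau, horizon) {
      // 1. Delay-embed the series into dim-dimensional points.
      var pts = [];
      for (var i = 0; i + (dim - 1) * tau < x.length; i++) {
        var p = [];
        for (var d = 0; d < dim; d++) p.push(x[i + d * tau]);
        pts.push(p);
      }
      function dist(a, b) {
        var s = 0;
        for (var d = 0; d < a.length; d++) s += (a[d] - b[d]) * (a[d] - b[d]);
        return Math.sqrt(s);
      }
      // 2. For each point, find its nearest neighbor (excluding
      //    temporal neighbors) and accumulate log divergence over time.
      var n = pts.length - horizon;
      var sumLog = [], counts = [];
      for (var k = 0; k <= horizon; k++) { sumLog.push(0); counts.push(0); }
      for (var i = 0; i < n; i++) {
        var best = -1, bestD = Infinity;
        for (var j = 0; j < n; j++) {
          if (Math.abs(i - j) < dim * tau) continue; // Theiler window
          var dij = dist(pts[i], pts[j]);
          if (dij > 0 && dij < bestD) { bestD = dij; best = j; }
        }
        if (best < 0) continue;
        for (var k = 0; k <= horizon; k++) {
          var dk = dist(pts[i + k], pts[best + k]);
          if (dk > 0) { sumLog[k] += Math.log(dk); counts[k]++; }
        }
      }
      // 3. Fit a line to mean log divergence vs. k; the slope (per
      //    sample step) approximates the largest Lyapunov exponent.
      var t = [], y = [];
      for (var k = 0; k <= horizon; k++) {
        if (counts[k] > 0) { t.push(k); y.push(sumLog[k] / counts[k]); }
      }
      var mt = 0, my = 0;
      for (var k = 0; k < t.length; k++) { mt += t[k]; my += y[k]; }
      mt /= t.length; my /= t.length;
      var num = 0, den = 0;
      for (var k = 0; k < t.length; k++) {
        num += (t[k] - mt) * (y[k] - my);
        den += (t[k] - mt) * (t[k] - mt);
      }
      return num / den; // multiply by sample rate for units of 1/s
    }

A caveat for the brownbag: noisy data alone will also produce positive divergence estimates, so surrogate-data controls would be needed before claiming anything about the dynamics.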

Let’s cc Adrian Freed and John MacCallum on this “gyro” thread.
Adrian’s got the most insight into this and could help us make some actual scientific headway
toward publishable results.

My question is: by doing some stats on clouds of orientation measurements,
can we get some measure of collective intention (coordinated attention),
not necessarily at any one instant of time (a meaningless notion in a relativistic world like ours),
but in some generalized (collective) specious present?
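One concrete, hedged reading of this question: treat each person’s orientation as a unit “facing” vector, pool all vectors across the group over a sliding window (a crude stand-in for a shared specious present), and compute the mean resultant length, a standard directional statistic that is 1 when all orientations agree and near 0 when they are scattered. A minimal JavaScript sketch, assuming unit 3-vectors:

    // Coherence of a cloud of orientation measurements.
    // Input: a non-empty array of unit 3-vectors [x, y, z], pooled
    // across people over a time window.
    // Output: mean resultant length, in [0, 1].
    function meanResultantLength(vectors) {
      var sx = 0, sy = 0, sz = 0;
      for (var i = 0; i < vectors.length; i++) {
        sx += vectors[i][0];
        sy += vectors[i][1];
        sz += vectors[i][2];
      }
      var n = vectors.length;
      return Math.sqrt(sx * sx + sy * sy + sz * sz) / n;
    }

    // Perfect agreement -> 1; an antipodal pair -> 0:
    // meanResultantLength([[0, 0, 1], [0, 0, 1]])  === 1
    // meanResultantLength([[0, 0, 1], [0, 0, -1]]) === 0

Plotting this statistic over successive windows would give a time-resolved measure of how coordinated the group’s attention is, without privileging any single instant.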

Shall we plan John and Teoma’s workshop schedule, hour by hour, over tea this coming week?

Kristi or Garrett, or __: please let us know when the “heartbeat” workshop Weebly site is posted and linked from the Synthesis research page, OK?

Cheers,
Xin Wei

On Feb 6, 2015, at 12:13 PM, Michael Krzyzaniak <mkrzyzan@asu.edu> wrote:

I translated Seb's sensor fusion algorithm into JavaScript to be used within Max/MSP:


There was still quite a bit of drift when I tested it, but I was only using a 100 Hz sample rate, which I suspect may have been the main issue.
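For reference, since the attached code isn’t reproduced here: the openly published MadgwickAHRS IMU update (gyro + accelerometer, no magnetometer) that such a translation would follow looks roughly like this in JavaScript. This is a sketch, not Mike’s actual translation; beta trades convergence speed against noise sensitivity, and sampleFreq must match the real sensor rate or the integration step itself produces drift:

    // Condensed JavaScript rendering of the openly published
    // MadgwickAHRS IMU update (gyro + accelerometer only).
    var sampleFreq = 100.0; // Hz; must match the real sensor rate
    var beta = 0.1;         // gain: convergence speed vs. noise
    var q0 = 1, q1 = 0, q2 = 0, q3 = 0; // orientation quaternion

    // gx, gy, gz in rad/s; ax, ay, az in any consistent unit.
    function madgwickUpdateIMU(gx, gy, gz, ax, ay, az) {
      // Rate of change of the quaternion from the gyroscope.
      var qDot1 = 0.5 * (-q1 * gx - q2 * gy - q3 * gz);
      var qDot2 = 0.5 * ( q0 * gx + q2 * gz - q3 * gy);
      var qDot3 = 0.5 * ( q0 * gy - q1 * gz + q3 * gx);
      var qDot4 = 0.5 * ( q0 * gz + q1 * gy - q2 * gx);

      if (!(ax === 0 && ay === 0 && az === 0)) {
        // Normalize the accelerometer reading.
        var r = 1 / Math.sqrt(ax * ax + ay * ay + az * az);
        ax *= r; ay *= r; az *= r;
        // Gradient-descent corrective step toward gravity.
        var s0 = 4*q0*q2*q2 + 2*q2*ax + 4*q0*q1*q1 - 2*q1*ay;
        var s1 = 4*q1*q3*q3 - 2*q3*ax + 4*q0*q0*q1 - 2*q0*ay
               - 4*q1 + 8*q1*q1*q1 + 8*q1*q2*q2 + 4*q1*az;
        var s2 = 4*q0*q0*q2 + 2*q0*ax + 4*q2*q3*q3 - 2*q3*ay
               - 4*q2 + 8*q2*q1*q1 + 8*q2*q2*q2 + 4*q2*az;
        var s3 = 4*q1*q1*q3 - 2*q1*ax + 4*q2*q2*q3 - 2*q2*ay;
        r = 1 / Math.sqrt(s0*s0 + s1*s1 + s2*s2 + s3*s3);
        qDot1 -= beta * s0 * r;
        qDot2 -= beta * s1 * r;
        qDot3 -= beta * s2 * r;
        qDot4 -= beta * s3 * r;
      }
      // Integrate the rate of change and renormalize.
      q0 += qDot1 / sampleFreq;
      q1 += qDot2 / sampleFreq;
      q2 += qDot3 / sampleFreq;
      q3 += qDot4 / sampleFreq;
      var rn = 1 / Math.sqrt(q0*q0 + q1*q1 + q2*q2 + q3*q3);
      q0 *= rn; q1 *= rn; q2 *= rn; q3 *= rn;
    }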

Mike


 On Sat, Jan 31, 2015 at 3:45 PM, Adrian Freed <adrian@adrianfreed.com> wrote:
  Thanks Xin Wei.
  It would indeed be good to at least develop a road map for this important work. We should bring the folk from x-io
  into the discussion because they have moved their considerable energies and skills further into this space in 2015.
 
  I also want to clarify my relative silence on this front. As well as weathering some perfect storms last year, I found
  the following project attractive from the perspective of separating concerns for this orientation work: http://store.sixense.com/collections/stem
  They are still unfortunately in pre-order land with a 2-3 month shipping time. Such a system would complement commercial and inertial measuring systems well
  by providing a "ground truth" ("ground fib") anchored to their beacon transmitter. The Sixense system has limited range for many of our applications,
  which brings up the question (again, as a separation of concerns, not to limit our perspectives) of scale. Many folk are thinking about orientation and inertial
  sensing for each digit of the hand (via rings).
 
  For the meeting we should prepare to share something about our favored use scenarios.

 On Jan 31, 2015, at 1:37 PM, Xin Wei Sha <Xinwei.Sha@asu.edu> wrote:
 
 
  Can you, Adrian and Mike, Doodle a Skype to talk about who should do what, and when, to get gyro information from multiple (parts of) bodies
  into our Max platform, so Mike and the signal processing maths folks can look at the data?
 
  This Skype should include at least one of our signal processing PhDs as well?
 
  Mike can talk about what he’s doing here, and get your advice on how we should proceed:
  1. write our own gyro (orientation) feature accumulator;
  2. get a pre-alpha version of the x-OSC hardware + software from Seb Madgwick that incorporates that data;
  3. adapt something from the odot package that we can use now;
  4. WAIT till orientation data can be integrated easily (when, 2015?).
 
  Half an hour should suffice.
  I don’t have to be at this Skype as long as there’s a precise outcome and productive decision that’ll lead us to computing some (cor)relations on streams of orientations as a start...
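  As a strawman for that precise outcome: one simple start is to reduce each body’s orientation stream to a scalar, e.g. the frame-to-frame rotation angle, and then compute a Pearson correlation between bodies over a sliding window. A hedged JavaScript sketch, assuming streams of unit quaternions [w, x, y, z] sampled at the same rate:

    // Angle (rad) between successive unit quaternions: a scalar
    // "amount of rotation" stream derived from an orientation stream.
    function rotationRates(quats) {
      var rates = [];
      for (var i = 1; i < quats.length; i++) {
        var d = 0;
        for (var k = 0; k < 4; k++) d += quats[i][k] * quats[i - 1][k];
        d = Math.min(1, Math.abs(d)); // |dot|: q and -q are one rotation
        rates.push(2 * Math.acos(d));
      }
      return rates;
    }

    // Pearson correlation of two equal-length scalar streams.
    function correlation(a, b) {
      var n = a.length, ma = 0, mb = 0;
      for (var i = 0; i < n; i++) { ma += a[i]; mb += b[i]; }
      ma /= n; mb /= n;
      var num = 0, da = 0, db = 0;
      for (var i = 0; i < n; i++) {
        num += (a[i] - ma) * (b[i] - mb);
        da += (a[i] - ma) * (a[i] - ma);
        db += (b[i] - mb) * (b[i] - mb);
      }
      return num / Math.sqrt(da * db);
    }

    // e.g. correlation(rotationRates(qsA), rotationRates(qsB))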
 
  Cheers,
  Xin Wei
 
  __________


On Jan 31, 2015, at 1:27 PM, Vangelis <vl_artcode@yahoo.com> wrote:

 Hello!
  Yes, there is great demand for something that works for sensor fusion on inertial sensors, but I think the best way to do it is as part of o. (odot), so as to benefit every inertial setup out there. It will take ages for Seb to implement it for x-OSC, and that would be an exclusive benefit. Seb's PhD is out there, and I am sure he will help by sharing new code for solving the problem. The question is: can we do this? :)
  My warm regards to everyone!
  v


On Jan 30, 2015, at 6:45 PM, Adrian Freed <adrian@adrianfreed.com> wrote:

  Hi.
  The experts on your question work at x-io. Seb Madgwick wrote the code a lot of people around the world are using for sensor
  fusion in IMUs.
  Are you using their IMU (x-OSC) as a source of inertial data?
 
  We started to integrate Seb's code into Max/MSP but concluded it would be better to wait for Seb
  to build it into x-OSC itself. There are some important reasons that this is a better approach, e.g.,
  reasoning about sensor fusion in a context with packet loss is difficult.
 
  It is possible Vangelis persisted with the Max/MSP route.
 
 
  On Jan 30, 2015, at 3:01 PM, Michael Krzyzaniak <mkrzyzan@asu.edu> wrote:
 
  Hi Adrian,
 
  I am a PhD student at ASU and I work with Xin Wei at the Synthesis Center. We are interested in fusing inertial sensor data (accel/gyro/mag) to give us reliable orientation (and possibly position) information. Do you have an implementation of such an algorithm that we could use in (or port to) Max/MSP?
 
  Thanks,
  Mike