Tracking individual objects with blinking-coded LEDs

Dear all,

I would like to track different moving objects on our theater stage. Imagine something like ~15 toy cars. The idea is that each ‘car’ has an infrared LED on top. The LED blinks in a unique pattern. A camera at the ceiling sees all the blinking LEDs, the system recognizes the blink code and can tell me where each car is at the moment.
It’s basically the same system the Oculus Rift uses to track the headset.

Do you have any ideas where to start? I guess I’m missing the right keywords for a successful search…

Thank you and all the best, Julez

cannot provide any keywords, but i think this should be a classic example where VL.OpenCV could shine:

  • detect areas in image where blinking occurs
  • observe the blinking over time
  • detect the pattern
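The three steps above could be sketched roughly like this in Python (a minimal sketch, assuming the frames have already been thresholded to binary images and the blob positions are known; `sample_blink_states` and `match_code` are hypothetical helper names):

```python
import numpy as np

def sample_blink_states(frames, positions, radius=3):
    """For each known (x, y) blob position, record one on/off bit per frame."""
    bits = {p: [] for p in positions}
    for frame in frames:
        for (x, y) in positions:
            patch = frame[y - radius:y + radius + 1, x - radius:x + radius + 1]
            bits[(x, y)].append(int(patch.any()))  # any lit pixel -> LED "on"
    return bits

def match_code(bit_sequence, codes):
    """Check which known blink code appears in the observed bit sequence."""
    seq = "".join(map(str, bit_sequence))
    for car_id, code in codes.items():
        if code in seq:
            return car_id
    return None
```

This sidesteps the detection of blinking *areas* (step 1) by assuming the blob positions are already known; in practice those would come from the blob tracker.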


  • how to detect where the blink pattern starts?
  • are there potential problems with the blink duration interfering with the camera’s framerate?
  • can you do an initial detection and then use an image tracker to follow the robot over time, or do you need continuous detection?

keep us posted on how you proceed; if there isn’t something ready out there for this kind of problem, this could become a textbook example…

Thanks for your response, joreg.
I had some more thoughts on it before getting into prototyping:

  • as the speed of the ‘cars’ isn’t that fast, a relatively slow blinking should be fine and the framerate of the camera shouldn’t be a problem anymore. (I found this interesting blog post on reverse-engineering the Oculus Rift, and they actually sync the framerate with a fast 10-bit blinking pattern.)
  • i thought about storing the blinking pattern in a boolean array, doing a 1D FFT on it and comparing it to similar patterns. This is just a thought and I would check some basic gesture recognizers for the comparison. Maybe I’m totally wrong with this :).
  • I read about “particle tracking” systems that predict the direction of motion so the blobs aren’t lost when the LEDs turn off. Maybe that’s a way?
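On the boolean-array idea in the second bullet: instead of comparing FFT magnitudes directly, a circular cross-correlation (computed via the FFT) is often enough to match a recorded blink sequence against each known pattern regardless of phase offset. A minimal sketch, assuming equal-length sequences and a hypothetical `best_match` helper:

```python
import numpy as np

def best_match(observed, patterns):
    """Return the pattern id whose circular correlation with `observed` peaks highest."""
    obs = np.asarray(observed, dtype=float)
    best_id, best_score = None, -1.0
    for pid, pat in patterns.items():
        p = np.asarray(pat, dtype=float)
        # circular cross-correlation via FFT, so a phase-shifted match still scores high
        score = np.fft.ifft(np.fft.fft(obs) * np.conj(np.fft.fft(p))).real.max()
        if score > best_score:
            best_id, best_score = pid, score
    return best_id
```

With codes designed to have low cross-correlation with each other (as in the Oculus-style 10-bit patterns), the highest peak reliably identifies the car.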

And I had another, not-so-elegant-but-way-simpler-idea:

  • In any case, all cars are controlled via WiFi and an ESP8266.
  • The Master-PC, which does the tracking, has an Initialize-Routine: it turns on one LED after another and stores the location and an ID for each car. After that, all LEDs turn on; the Master-PC still knows which car is which and can continue with a fairly standard tracking solution.
  • We might lose some functionality with this, but it might be a quicker way…
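The Initialize-Routine above could look roughly like this (a sketch only: `set_led` and `grab_frame` are hypothetical stand-ins for the WiFi command to the ESP8266 and the camera capture):

```python
import numpy as np

def initialize(car_ids, set_led, grab_frame, threshold=200):
    """Light up one LED at a time and store where it shows up in the image."""
    positions = {}
    for car_id in car_ids:
        set_led(car_id, on=True)
        frame = grab_frame()                    # grayscale image as ndarray
        ys, xs = np.nonzero(frame > threshold)  # bright pixels = this car's LED
        if len(xs):
            positions[car_id] = (int(xs.mean()), int(ys.mean()))  # blob centroid
        set_led(car_id, on=False)
    return positions
```

After this, a standard blob tracker can follow each stored position from frame to frame.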

I’ll keep you posted!
Best, Julez

@clockdivider here are some thoughts I had on how to do this with VL.OpenCV. There are likely many ways to go about it, but maybe you get some ideas out of this:

To recognize the pattern start/end, maybe use a marker such as 2 flashes within a set amount of time indicating the start of the pattern (the same for all units), with the actual code beginning, say, .5 secs after the marker; this then repeats over and over.
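One hedged way to implement that start marker: prefix every code with a fixed header (two flashes encoded as `1010` here, which is an assumption, as is the payload length) and scan the observed bit stream for it before reading the payload bits:

```python
HEADER = [1, 0, 1, 0]   # two flashes: on-off-on-off (assumed encoding)
PAYLOAD_BITS = 4        # assumed per-car code length

def extract_payload(bits):
    """Find the header in a recorded bit stream and return the code after it."""
    for i in range(len(bits) - len(HEADER) - PAYLOAD_BITS + 1):
        if bits[i:i + len(HEADER)] == HEADER:
            return bits[i + len(HEADER):i + len(HEADER) + PAYLOAD_BITS]
    return None
```

One caveat: the payload codes have to be chosen so the header sequence can't appear inside them, otherwise a false start can be detected.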

I presume that with some harsh thresholding you could block everything out except for the super bright areas (LEDs on).

That plus some Erode should give single white blobs, which can be tracked with the blob tracking.

Also, CountNonZero after the erode: knowing the expected average pixel size per blob, that can help you count blinks, no?

As for losing track of blobs when LEDs are off: you could negate the whole thing and have the LEDs on all the time, “blinking” by turning them off. You could also do something like using an average image of the last few frames for blob tracking (no off state should make a big difference in the average image, so you should always be able to track the blob), while using a non-averaged image to count the blinks.
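The averaging trick above could be a simple exponential moving average: track blobs on the averaged image (an LED's off frames barely dent its bright spot there) while counting blinks on the raw frames. A minimal sketch, with `AveragedTracker` and the `alpha` value being assumptions:

```python
import numpy as np

class AveragedTracker:
    def __init__(self, alpha=0.2):
        self.alpha = alpha   # lower alpha = longer memory over past frames
        self.avg = None

    def update(self, frame):
        f = frame.astype(np.float32)
        if self.avg is None:
            self.avg = f
        else:
            # exponential moving average over the last few frames
            self.avg = self.alpha * f + (1 - self.alpha) * self.avg
        return self.avg      # feed this image to the blob tracker
```

Alternatively, OpenCV's `cv2.accumulateWeighted` does the same accumulation in place.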

Hope some of that makes sense to you. Good luck!


IIRC @oschatz did something similar at @MESO – using blinking LEDs for projection mapping – quite some time ago. Don’t know if he is still reading the forum, though.

