Pixy2 can find literally hundreds of objects at a time. It uses a connected components algorithm to determine where one object begins and another ends. Pixy2 then compiles the sizes and locations of each object and reports them through one of its interfaces (e.g. SPI).

{{wiki:v2:pixy_balls2.mp4||loop,autoplay}}
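To give a sense of what this looks like from a program's point of view, here is a minimal sketch that reads the detected blocks each frame. It assumes an Arduino connected to Pixy2 over the default SPI link and the Pixy2 Arduino library's ''pixy.ccc'' interface; see the CCC API page linked below for the definitive API on your platform.

<code cpp>
#include <Pixy2.h>

Pixy2 pixy;

void setup()
{
  Serial.begin(115200);
  pixy.init();  // bring up the link to Pixy2 (SPI by default)
}

void loop()
{
  // Ask Pixy2 for the latest color connected components results.
  pixy.ccc.getBlocks();

  // Each block is one detected object: its signature, position, and size.
  for (int i = 0; i < pixy.ccc.numBlocks; i++)
  {
    Serial.print("sig ");
    Serial.print(pixy.ccc.blocks[i].m_signature);
    Serial.print("  x=");
    Serial.print(pixy.ccc.blocks[i].m_x);
    Serial.print("  y=");
    Serial.print(pixy.ccc.blocks[i].m_y);
    Serial.print("  w=");
    Serial.print(pixy.ccc.blocks[i].m_width);
    Serial.print("  h=");
    Serial.println(pixy.ccc.blocks[i].m_height);
  }
}
</code>
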
==== Teach it the objects you’re interested in ====
  
Pixy2 is unique because you can physically teach it what you are interested in sensing. Purple dinosaur? Place the dinosaur in front of Pixy2 and press the button. Orange ball? Place the ball in front of Pixy2 and press the button. It’s easy, and it’s fast.
  
More specifically, you teach Pixy2 by holding the object in front of its lens while holding down the button located on top. While you do this, the RGB LED under the lens provides feedback about which object Pixy2 is looking at. For example, the LED turns orange when an orange ball is placed directly in front of Pixy2. Release the button and Pixy2 generates a statistical model of the colors contained in the object and stores it in flash. It will then use this statistical model to find objects with similar color signatures in its frame from then on.
  
Pixy2 can learn seven color signatures, numbered 1-7. Color signature 1 is the default signature. Teaching Pixy2 the other signatures (2-7) requires a simple button-press sequence.
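Each detected block reports which signature it matched, so a program can react to one taught object and ignore the rest. As a rough sketch (same ''setup()'' as the example above, and again assuming the Pixy2 Arduino library), this loop only reports objects that matched signature 2:

<code cpp>
void loop()
{
  pixy.ccc.getBlocks();

  // Only react to objects that matched signature 2
  // (whatever object was taught as the second signature).
  for (int i = 0; i < pixy.ccc.numBlocks; i++)
  {
    if (pixy.ccc.blocks[i].m_signature == 2)
    {
      Serial.print("signature 2 object at x=");
      Serial.print(pixy.ccc.blocks[i].m_x);
      Serial.print(", y=");
      Serial.println(pixy.ccc.blocks[i].m_y);
    }
  }
}
</code>
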
==== Pixy2 "tracks" each object it detects ====
  
Once Pixy2 detects a new object, it will add it to a table of objects that it is currently tracking and assign it a tracking index. It will then attempt to find the object (and every object in the table) in the next frame by finding its best match. Each tracked object receives an index between 0 and 255 that it will keep until it either leaves Pixy2's field-of-view, or Pixy2 can no longer find the object in subsequent frames (because of occlusion, lack of lighting, etc.).
  
Tracking is useful when you want your program to keep tabs on a certain instance of an object, even though there may be several other similar objects in the frame.
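For example, a program can latch onto the tracking index of the first object it sees and then follow that particular object from frame to frame. This is only a sketch, assuming the Pixy2 Arduino library, where the tracking index is assumed to be exposed as each block's ''m_index'' field:

<code cpp>
#include <Pixy2.h>

Pixy2 pixy;
int trackedIndex = -1;  // tracking index of the object we follow; -1 = none yet

void setup()
{
  Serial.begin(115200);
  pixy.init();
}

void loop()
{
  pixy.ccc.getBlocks();

  bool found = false;
  for (int i = 0; i < pixy.ccc.numBlocks; i++)
  {
    // Latch onto the first object we see, then keep following its index.
    if (trackedIndex < 0)
      trackedIndex = pixy.ccc.blocks[i].m_index;

    if (pixy.ccc.blocks[i].m_index == trackedIndex)
    {
      found = true;
      Serial.print("tracked object at x=");
      Serial.println(pixy.ccc.blocks[i].m_x);
    }
  }

  // If Pixy2 no longer reports that index, the object was lost;
  // latch onto a new object on a later frame.
  if (!found)
    trackedIndex = -1;
}
</code>
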
  
Color codes (CCs) might be particularly useful for helping a robot navigate. For example, an indoor environment with CCs uniquely identifying each doorway and hallway would be both low-cost and robust.
==== Color connected components API ====

The color connected components API can be found [[wiki:v2:ccc_api|here]].

==== Running color connected components in PixyMon ====

Information about running and configuring the color connected components program in PixyMon can be found [[wiki:v2:ccc_pixymon|here]].
  