====== Pixy2 Overview ======

{{wiki:img:pixy2_in_hand-300px.jpg}}

Pixy2 is the second version of Pixy. It's faster, smaller and more capable than the original Pixy, adding line tracking/following algorithms as well as other features. Here's what we've added to Pixy2:

  * Pixy2 detects lines, intersections and small barcodes, intended for line-following robots
  * Improved framerate -- 60 frames-per-second
  * Tracking algorithms have been added to color-based object detection
  * Improved and simplified libraries for Arduino, LEGO Mindstorms EV3, Raspberry Pi and other controllers
  * Integrated light source

And of course, Pixy2 does everything that the original Pixy can do:

  * Small, fast, easy-to-use, low-cost, readily-available vision system
  * Learns to detect objects that you teach it
  * Connects to Arduino with the included cable. Also works with LEGO Mindstorms EV3, Raspberry Pi, BeagleBone and similar controllers
  * All libraries for Arduino, LEGO Mindstorms EV3, Raspberry Pi, etc. are provided
  * C/C++ and Python are supported
  * Communicates via one of several interfaces: SPI, I2C, UART, USB or analog/digital output
  * Configuration utility runs on Windows, MacOS and Linux
  * All software/firmware is open-source and GNU-licensed
  * All hardware documentation (schematics, bill of materials, PCB layout, etc.) is provided

===== How Pixy got started =====

Pixy (CMUcam5) is a partnership between the Carnegie Mellon Robotics Institute and Charmed Labs. Pixy comes from a long line of CMUcams, but Pixy got its real start as a [[https://www.kickstarter.com/projects/254449872/pixy-cmucam5-a-fast-easy-to-use-vision-sensor|Kickstarter campaign]]. It first started shipping in March of 2014 and has since become the most popular vision system in history! Pixy is funded exclusively through sales, so thank you for helping make Pixy a success!

You can watch the original Kickstarter video below -- it's a good introduction!

{{youtube>J8sl3nMlYxM?large}}

\\

===== Vision as a Sensor =====

If you want your robot to perform a task such as picking up an object, chasing a ball or locating a charging station, and you want a single sensor that can help accomplish all of these tasks, then **vision** is your sensor. Vision (image) sensors are useful because they are so flexible: with the right algorithm, an image sensor can sense or detect practically anything. But there are two drawbacks with image sensors: 1) they output lots of data, dozens of megabytes per second, and 2) processing this amount of data can overwhelm many processors. And even if the processor can keep up with the data, much of its processing power won't be available for other tasks.

Pixy2 addresses these problems by pairing a powerful dedicated processor with the image sensor. Pixy2 processes images from the image sensor and sends only the useful information (e.g. purple dinosaur detected at x=54, y=103) to your microcontroller, and it does this at frame rate (60 Hz). The information is available through one of several interfaces: UART serial, SPI, I2C, USB, or digital/analog output. So your Arduino or other microcontroller can talk easily with Pixy2 and still have plenty of CPU available for other tasks.

It's possible to hook up multiple Pixy2s to your microcontroller -- for example, a robot with four Pixy2s for omnidirectional sensing. Or use Pixy2 without a microcontroller at all and use the digital or analog outputs to trigger events, switches, servos, etc.
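To give a sense of how little work is left for the host controller, here is a minimal sketch that polls Pixy2 for color-based object detections and prints their positions over the serial monitor. This is only a sketch, and it assumes the Pixy2 Arduino library's color connected components API (the ''Pixy2'' class, ''pixy.ccc.getBlocks()'' and the ''pixy.ccc.blocks'' array); check the library documentation for the exact names on your platform.

<code cpp>
#include <Pixy2.h>   // Pixy2 Arduino library (uses SPI by default)

Pixy2 pixy;

void setup()
{
  Serial.begin(115200);
  pixy.init();                  // initialize the link to Pixy2
}

void loop()
{
  pixy.ccc.getBlocks();         // ask Pixy2 for the latest detected objects

  // Each "block" is one detected object: signature, x, y, width, height.
  for (int i = 0; i < pixy.ccc.numBlocks; i++)
  {
    Serial.print("sig ");
    Serial.print(pixy.ccc.blocks[i].m_signature);
    Serial.print(" at x=");
    Serial.print(pixy.ccc.blocks[i].m_x);
    Serial.print(" y=");
    Serial.println(pixy.ccc.blocks[i].m_y);
  }
}
</code>

Because Pixy2 has already done the image processing, the loop above receives only a few bytes per detected object rather than raw pixel data, leaving the rest of the microcontroller's CPU free for your application.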
===== Controller support =====

Pixy2 can easily connect to lots of different controllers because it supports several interface options (UART serial, SPI, I2C, USB, or digital/analog output), but Pixy began its life talking to Arduinos. We've added support for Arduino Due, Raspberry Pi and BeagleBone Black, as well as LEGO Mindstorms EV3. Software libraries are provided for all of these platforms so you can get up and running quickly. Additionally, we've added a Python API if you're using a Linux-based controller (e.g. Raspberry Pi, BeagleBone).

===== 60 frames per second =====

What does "60 frames per second" mean? In short, it means Pixy2 is fast. Pixy2 processes an entire image frame every 1/60th of a second (16.7 milliseconds), so you get a complete update of all detected objects' positions every 16.7 ms. At this rate, tracking the path of a falling or bouncing ball is possible. (A ball traveling at 40 mph moves less than a foot in 16.7 ms.) If your robot is line following, it will typically move only a small fraction of an inch between frames.

{{page>wiki:v2:color_connected_components&noindent}}

{{page>wiki:v2:line_tracking&noindent}}

{{page>wiki:v2:video&noindent}}

===== PixyMon lets you see what Pixy2 sees =====

PixyMon is an application that runs on Windows, MacOS and Linux. It allows you to see what Pixy2 sees, either as raw or processed video. It also allows you to configure your Pixy2, set the output port and manage color signatures. PixyMon communicates with Pixy2 over a standard mini USB cable.

PixyMon is great for debugging your application. You can plug a USB cable into the back of Pixy2, run PixyMon, and see what Pixy2 sees while it is still hooked to your Arduino or other microcontroller -- no need to unplug anything. PixyMon is open source, like everything else.

{{wiki:v2:color_tracking.png|PixyMon}}

===== Technical specs =====

  * Processor: NXP LPC4330, 204 MHz, dual core
  * Image sensor: Aptina MT9M114, 1296x976 resolution with integrated image flow processor
  * Lens field-of-view: 60 degrees horizontal, 40 degrees vertical
  * Power consumption: 140 mA typical
  * Power input: USB input (5V) or unregulated input (6V to 10V)
  * RAM: 264K bytes
  * Flash: 2M bytes
  * Available data outputs: UART serial, SPI, I2C, USB, digital, analog
  * Dimensions: 1.5" x 1.65" x 0.6"
  * Weight: 10 grams
  * Integrated light source, approximately 20 lumens

{{wiki:v2:pixy2_front_labeled.jpg?640}}
{{wiki:v2:pixy2_back_labeled.jpg?640}}

This [[wiki:v2:port_pinouts|page has pinout information]].

===== Where can I buy? =====