
Pixy 1.0 - Vision Sensor for the Arduino

Pixy is a fast vision sensor you can quickly “teach” to find objects, and it connects directly to Arduino and other controllers.

Background

Image sensors are useful because they are so flexible. With the right algorithm, an image sensor can sense or detect practically anything. But there are two drawbacks to image sensors: 1) they output lots of data, dozens of megabytes per second, and 2) processing this much data can overwhelm many processors. And even if the processor can keep up with the data, much of its processing power won't be available for other tasks.

Pixy addresses these problems by pairing a powerful dedicated processor with the image sensor.  Pixy processes images from the image sensor and only sends the useful information (e.g. purple dinosaur detected at x=54, y=103) to your microcontroller.  And it does this at frame rate (50 Hz).  The information is available through one of several interfaces: UART serial, SPI, I2C, digital out, or analog out.  So your Arduino or other microcontroller can talk easily with Pixy and still have plenty of CPU available for other tasks.  
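To give a feel for how little work is left for the host, here is a minimal Arduino sketch -- a sketch only, assuming the standard Pixy Arduino library (Pixy.h) and the default SPI connection -- that simply prints whatever Pixy reports each frame:

```cpp
// Minimal example: Pixy does the vision processing; the Arduino just reads results.
// Assumes the standard Pixy Arduino library (Pixy.h) and the default SPI cable.
#include <SPI.h>
#include <Pixy.h>

Pixy pixy;

void setup() {
  Serial.begin(9600);
  pixy.init();                        // start communicating with Pixy over SPI
}

void loop() {
  uint16_t count = pixy.getBlocks();  // detected objects in the latest frame
  for (uint16_t i = 0; i < count; i++) {
    Serial.print("sig ");
    Serial.print(pixy.blocks[i].signature);  // which taught color signature matched
    Serial.print(" at x=");
    Serial.print(pixy.blocks[i].x);
    Serial.print(", y=");
    Serial.println(pixy.blocks[i].y);
  }
}
```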

It's possible to hook up multiple Pixys to your microcontroller -- for example, a robot with 4 Pixys and 360 degrees of sensing.  Or skip the microcontroller entirely and use Pixy's digital or analog outputs to trigger events, switches, servos, etc.

Pixy easily connects to Arduino or other microcontrollers.

Purple dinosaurs (and other things)

Pixy uses a hue-based color filtering algorithm to detect objects.  Most of us are familiar with RGB (red, green, and blue) as a way to represent colors.  Pixy calculates the hue and saturation of each RGB pixel from the image sensor and uses these as the primary filtering parameters.  An object's hue remains largely unchanged with changes in lighting and exposure -- the very changes that can cause color filtering algorithms to break in frustrating ways.  As a result, Pixy's filtering algorithm is robust to lighting and exposure changes, and significantly better in this respect than previous versions of the CMUcam.

Seven color signatures

Pixy remembers up to 7 different color signatures, which means that if you have 7 different objects with unique colors, Pixy’s color filtering algorithm will have no problem identifying them.  If you need more than seven, you can use color codes (see below).  

Hundreds of objects

Pixy can find literally hundreds of objects at a time.  It uses a connected components algorithm to determine where one object begins and another ends.  Pixy then compiles the sizes and locations of each object and reports them through one of its interfaces (e.g. SPI).  
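As one illustration of how you might use that per-object data, the sketch below (same assumed Pixy Arduino library as above; TARGET_SIG and the "largest block" heuristic are just examples, not part of the library) picks the biggest detected block for one taught signature, which is usually the closest object:

```cpp
// Example: find the largest block for a given color signature each frame.
#include <SPI.h>
#include <Pixy.h>

Pixy pixy;
const uint16_t TARGET_SIG = 1;        // the color signature we taught Pixy

void setup() {
  Serial.begin(9600);
  pixy.init();
}

void loop() {
  uint16_t count = pixy.getBlocks();  // all objects Pixy found this frame
  uint32_t bestArea = 0;
  int best = -1;

  for (uint16_t i = 0; i < count; i++) {
    if (pixy.blocks[i].signature != TARGET_SIG)
      continue;
    uint32_t area = (uint32_t)pixy.blocks[i].width * pixy.blocks[i].height;
    if (area > bestArea) {            // biggest block is usually the closest object
      bestArea = area;
      best = i;
    }
  }

  if (best >= 0) {
    Serial.print("Largest signature-1 object at x=");
    Serial.println(pixy.blocks[best].x);  // x grows left-to-right across the frame
  }
}
```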

Pixy can track multiple objects with multiple color signatures (PixyMon view shown.)

50 frames per second

What does “50 frames per second” mean?  In short, it means Pixy is fast.  Pixy processes an entire 640x400 image frame every 1/50th of a second (20 milliseconds).  This means you get a complete update of all detected objects' positions every 20 ms.  At this rate, tracking the path of a falling or bouncing ball is possible.  (A ball traveling at 30 mph moves less than a foot in 20 ms.)
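As a quick check of that parenthetical claim (plain unit conversion, nothing Pixy-specific):

\[
30\ \text{mph} \approx 13.4\ \text{m/s}, \qquad 13.4\ \text{m/s} \times 0.020\ \text{s} \approx 0.27\ \text{m} \approx 0.9\ \text{ft}
\]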

Pixy is fast, so your robot can be fast -- shown with glasses and wig (optional).

Teach it the objects you're interested in

Pixy is unique because you can physically teach it what you are interested in sensing.  Purple dinosaur?  Place the dinosaur in front of Pixy and press the button.  Orange ball?  Place the ball in front of Pixy and press the button.  It’s easy, and it's fast.    

More specifically, you teach Pixy by holding the object in front of its lens while holding down the button located on top.  While you do this, the RGB LED under the lens provides feedback by taking on the color of the object directly in front of it.  For example, the LED turns orange when an orange ball is placed directly in front of Pixy.  Release the button and Pixy generates a statistical model of the colors contained in the object and stores it in flash.  From then on, Pixy uses this statistical model to find objects with similar color signatures in its frames.

Pixy being taught. When the LED color matches the color of the object, release the button. Pixy will then find objects that match.

Pixy can learn seven color signatures, numbered 1-7.  Color signature 1 is the default signature.  Teaching Pixy the other signatures (2-7) requires a simple button-press sequence.

About the cost of two sonar sensors

We’ve done our best to keep the cost of Pixy as low as possible.  Improvements in technology deserve much of the credit, but the Kickstarter campaign was a big help as well.  The Kickstarter funds allowed us to manufacture in sufficient quantity to get parts and manufacturing costs down.  The result is that Pixy is available to a wider audience, which has always been the point of the CMUcam: to put a capable, easy-to-use vision sensor in the hands of lots of people.

PixyMon lets you see what Pixy sees

PixyMon is an application that runs on your PC or Mac.  It allows you to see what Pixy sees, either as raw or processed video.  It also allows you to configure your Pixy, set the output port and manage color signatures.  PixyMon communicates with Pixy over a standard mini USB cable.

PixyMon is great for debugging your application.  You can plug a USB cable into the back of Pixy and run PixyMon and then see what Pixy sees while it is hooked to your Arduino or other microcontroller -- no need to unplug anything.  PixyMon is open source, like everything else.  It's written using the Qt framework.  

PixyMon runs on PC or Mac and allows you to see what Pixy sees. Communication takes place over a standard USB cable.

What’s a “color code”?

A color code (CC) is two or more color tags placed close together.  Pixy can detect and decode CCs and present them as special objects.  CCs are useful if you have lots of objects you want to detect and identify (i.e. more than could be detected with the seven separate color signatures alone.)  

Simple color code (left), as Pixy sees it (right, shown via PixyMon)

In our demo video, we created CCs with 2 color tags using 4 different colors (identified by teaching Pixy 4 different color signatures).  Depending on your requirements, such a scheme (2 tags, 4 colors) can detect up to 12 unique objects.  CCs with 3, 4 and 5 tags and/or more colors are possible and allow for many, many more unique objects.  (In fact, thousands of unique codes are possible using CCs with 5 tags and 6 colors.  Hello inkjet printer!)
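One way those numbers could be arrived at (the counting rule here is an assumption, not something stated on this page: tag order matters, and adjacent tags must use different colors so they don't merge into a single blob):

\[
4 \times 3 = 12 \quad \text{(2 tags, 4 colors)}, \qquad 6 \times 5^{4} = 3750 \quad \text{(5 tags, 6 colors)}
\]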

Why Color Codes?  

CCs are useful if you have lots of objects you want to detect and identify, more than could be detected with the seven separate color signatures alone. CCs also improve detection accuracy by decreasing false detections.  That is, there is a low probability that specific colors will occur both in a specific order and close together.  The drawback is that you need to place a CC on each object you’re interested in detecting.  Often the object you’re interested in (yellow ball, purple toy) has a unique color signature and CCs aren’t needed.  Objects with CCs and objects without CCs can be used side-by-side with no problems, so you are free to use CCs for some objects and not others.

CCs give you an accurate angle estimate of the object (in addition to the position and size).  This is a computational “freebie” that some applications may find useful.  The angle estimate, decoded CCs, regular objects and all of their positions and sizes are provided at 50 frames per second.   

CCs might be particularly useful for helping a robot navigate.  For example, an indoor environment with CCs uniquely identifying each doorway and hallway would be both low-cost and robust.  

 

Further details and guides can be found on the Pixy Wiki.

