One of the first things I did with my Pi was to try and figure out what's possible in terms of graphics. This included a fun exploration of OpenVG (and very simple particle effects) on the Pi. I recommend checking out these projects, both of which I'll definitely be playing with more:
- OpenVG Testbed - Examples and useful helper functions, if you don't mind getting your hands dirty with C.
- node-openvg-canvas - The HTML5 Canvas API implemented on top of OpenVG.

## OpenVG & Node
I'd heard about OpenVG before my Raspberry Pi arrived. At work we were using a similar bit of hardware, and achieving any sort of hardware-accelerated graphics was a massive bugbear. The RPi community is pretty amazing, and I enviously eyed the various graphics libraries that were springing up in my favourite languages. The node-openvg-canvas project appealed especially, as it implements the HTML5 Canvas API in OpenVG. This means:
- It's easy to grab demos from the web and try them out on the Pi with little-to-no conversion.
- You can also use other libraries that build on the Canvas API, such as a charting library like Flot, or theoretically even something as awesome as three.js, which can render to a Canvas (just one of its available renderers).
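To sketch what that portability looks like: drawing code written purely against the standard Canvas 2D API doesn't care whether the context came from a browser `<canvas>` element or from node-openvg-canvas on the Pi. (The `require('openvg-canvas')` wiring in the comment is an assumption based on that project's docs; the drawing calls themselves are standard Canvas API.)

```javascript
// A frame-drawing function written purely against the Canvas 2D API,
// so the same code runs in a browser or on the Pi unchanged.
function drawFrame(ctx, width, height, t) {
  ctx.clearRect(0, 0, width, height);
  ctx.fillStyle = 'rgba(255, 128, 0, 0.8)';
  ctx.beginPath();
  // A dot orbiting the centre of the screen.
  const x = width / 2 + Math.cos(t) * 100;
  const y = height / 2 + Math.sin(t) * 100;
  ctx.arc(x, y, 8, 0, Math.PI * 2);
  ctx.fill();
}

// On the Pi the wiring would look roughly like this (hypothetical,
// following node-openvg-canvas's README):
//   const Canvas = require('openvg-canvas');
//   const canvas = new Canvas(1920, 1080);
//   const ctx = canvas.getContext('2d');
//   setInterval(() => drawFrame(ctx, 1920, 1080, Date.now() / 1000), 16);
```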
Step one for me was to grab the first HTML5 canvas particle demo I could find online to see how it ran with node-openvg-canvas.
It worked with minimal changes (my changes were backwards-compatible with the browser), but I couldn't add many particles to the scene before the framerate started to suck. My limit was a disappointing 20 particles. You can grab the code (and usage instructions) from this gist. Here's a video:
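The kind of backwards-compatible change involved is tiny. For example, a browser demo typically assumes requestAnimationFrame exists; a small shim keeps the same file working in both environments (a sketch of my own; the timer fallback and the 60fps figure are my choices, not anything mandated by node-openvg-canvas):

```javascript
// Use the host's requestAnimationFrame if one exists (a browser, or an
// environment that provides its own); otherwise fall back to a plain
// timer at roughly 60 frames per second.
const raf = (typeof requestAnimationFrame === 'function')
  ? requestAnimationFrame
  : (cb) => setTimeout(() => cb(Date.now()), 1000 / 60);

function loop(t) {
  // ...update and draw the particles here...
  raf(loop);
}
```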
The next thing to try was to go low-level (C) to get a feel for the performance I could expect, by using the raw OpenVG API. I came across the OpenVG Testbed project by @ajstarks, who has blogged about it here. It's a set of examples and handy helper functions for interacting with OpenVG in C.
I wanted something a bit more 'particley' too, so I watched this talk (from 5m 24s) for a quick crash course in faking physics from Seb Lee-Delisle, king of particle demos.
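The 'faked' physics boils down to the simplest possible integration: add gravity to each particle's velocity, add the velocity to its position, once per frame. A minimal sketch of that update step (the gravity constant and field names are arbitrary choices of mine):

```javascript
const GRAVITY = 0.5; // arbitrary downward acceleration, per frame

// Faked physics: no timestep, nothing fancier than
// "velocity += acceleration; position += velocity" once per frame.
function updateParticle(p) {
  p.vy += GRAVITY;
  p.x += p.vx;
  p.y += p.vy;
  return p;
}
```

In a real demo you'd also respawn (or fade out) particles once they leave the screen, but the loop above is the whole trick.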
It was a pretty straightforward task, with the testbed providing some useful help. Performance wasn't massively improved over my Node efforts: I could animate about 50 particles smoothly, provided I didn't mind monopolising ~45% of the CPU in the process.
I'm not comparing like for like demos, but my feeling here is that these sorts of naive implementations are going to be similarly performant using either approach (Node bindings or raw C API). Both are calling the same OpenVG functions under the hood, and Node isn't bringing much overhead when calling them in a single loop.
I suspect I'm missing a trick with transforms here, but perhaps not. Material on the raw OpenVG API is frustratingly scarce.
So here's the result currently. Nothing mindblowing, but good fun and the fairly pleasing screensaver feel has given me ideas for further tinkering. (Code and usage instructions here).
I did find applications where working with the C APIs was advantageous. One of my first experiments with the OpenVG Testbed was a scrolling 'ticker' line of text. The testbed let me take advantage of lower-level OpenVG functions like vgTranslate(), and the result is a nice, smooth scrolling line (at 1080p). There's a font conversion tool included in the testbed too, font2openvg, and I managed to convert a decent font over without much trouble. (Video).
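The ticker logic itself is just an offset that decreases each frame and wraps around once the text has scrolled fully off screen, with vgTranslate() applying the offset before the text is drawn. A sketch of the wrap-around arithmetic (in JavaScript for brevity; the real version is C against the testbed's helpers, and the function name here is my own):

```javascript
// Advance a ticker offset by `speed` pixels per frame, wrapping once
// the text (textWidth px wide) has scrolled completely off the left
// edge of a screen screenWidth px wide.
function nextOffset(offset, speed, textWidth, screenWidth) {
  offset -= speed;
  if (offset < -textWidth) {
    offset = screenWidth; // re-enter from the right edge
  }
  return offset;
}

// Each frame, the C version does roughly:
//   vgTranslate(offset, y);  /* then draw the text at the origin */
```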
That's all for now. If you've managed to get more out of your Pi, give me a shout on Twitter.