All According to Plan...
I was trying to learn some C++ and make, at the start of a long journey ahead of me as I try to become more familiar with the computing world
we live in and where it came from. The branch I made for this post crashes after five hours on my target machine (an Acer Chromebook 14), so I waited all year trying to make something so that I could write this: I wanted to start a cycle
where I mess around with an interest -> which leads me to a project -> then, after it's done, I take a break to reflect and float
on to whatever area the experience pushed me towards. However, this project didn't exactly go as planned, so my imagined routine was shattered, and this site fell silent...
But hey. It doesn't matter. I felt like I had nothing to say because I wanted to bring you only the best experiences to learn from. That feeling pushed me away from writing, but as I naturally came into contact with new ideas, my mind
started turning. Eventually, I did do something I really want to share -> a major checkpoint in the journey I mentioned earlier. So I opened up this website, saw this stub of a post, and thought: hey, I actually did learn a ton while working on this! It's not right to share only the things I consider working, since that's not how life works; in reality, sometimes things don't work out perfectly ⛓️💥. Programming is not about making money and tools for businesses. This project helps deliver fun visual performances... not shareholder value.
Fun the hard way: limited resources
My branch's problem is 100% solvable, but at a cost: I could have spent a lot more time on Arch-specific, non-C++ solutions that wouldn't ever have really worked, as maybe they were never meant to...
Why would we want X11 captures as a source on a Pi? On my target machine, I was using a basic browser to render some shaders on Shadertoy [1 2] and a clock. It was just too much to ask the Chromebook to generate images while also capturing and mapping them. The Pi / host connected to the projector should really only be the projection mapper: the one warping the sources correctly so they appear flat. The source of what is being projected should be remote and more powerful, but those solutions introduce the need for streaming and its laggy effects.
The Shadertoy UI offers the option to record frames as they're produced and save them as a .gif! The project I chose to work with, ofxPiMapper, is great at mapping .gifs on low-power Pis. However, suppose we wanted to, for example, use a camera and animate the shader only when people move, or let them control it with their hands.
If we wanted to generate the frames to be mapped instead of using pre-generated ones, ofxPiMapper might not be the best fit for us. The server-client extensions for ofxPiMapper only offer remote control to make mapping easier, not remote content streaming for enhanced experiences.
What I could do 
My branch adds a new source extension for ofxPiMapper, XSource, that lets you pick any open X11 window and capture what it's displaying as a mapping source. Really, all I wrote was this one GStreamer pipeline; the rest was pretty much made for me. I copied the basic source example and used the camera source as a reference to make the new source. I was able to use ofx's videoUtils.setPipeline(), which took care of running the pipeline for me:
std::string pipeline = "ximagesrc xid=" + std::to_string(targetWindow) + " use-damage=false ! "
"video/x-raw,format=BGRx,framerate=60/1 ! queue";
In XSource's ::update() function, called when the application wants a new frame
from the source, I read from the running GStreamer pipeline
and copied the pixels to the texture that I registered as my source:
if (videoUtils.isFrameNew()) {
    videoPixels = videoUtils.getPixels();
    if (videoPixels.isAllocated()) {
        // Allocate the texture once, on the first frame
        if (!videoTexture.isAllocated()) {
            videoTexture.allocate(videoPixels.getWidth(), videoPixels.getHeight(), GL_RGBA);
        }
        // Upload the new frame to the texture
        videoTexture.loadData(videoPixels);
    }
}
I ran the same GStreamer pipeline outside of ofx using gst-launch-1.0, and it also crashed after many hours. I couldn't figure out why, and I didn't care too much at that point, since I knew the solution was elsewhere, not in fixing this. This was also my first time trying Arch Linux, so I could just be bad at reading logs and underestimating my boi EDGAR
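For the curious, the standalone test looked roughly like this; the window id is a placeholder, and I've tacked on videoconvert and autovideosink so the captured frames actually land somewhere visible instead of in ofx's texture upload:

```shell
# Capture one X11 window at 60 fps and display it in a window of its own.
# Replace XID with a real window id (e.g. from `xwininfo` or `xdotool search`).
XID=0x3200002
gst-launch-1.0 ximagesrc xid="$XID" use-damage=false \
  ! video/x-raw,framerate=60/1 \
  ! videoconvert \
  ! autovideosink
```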
Results 
Note: I know it seems otherwise, but ofx is cross-platform. videoUtils smartly uses other solutions on Windows
when it can. ofx was dope.
Note 2: Fun Thread where all of us were trying to get ofxPiMapper up.