Now I know that Quartz Composer essentially turns QuickTime movies into a series of frames so it can scan back and forth through them, but does it do the same with a live video feed (i.e. turn the video into frames and put them on screen on the fly)?
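As far as I can tell the answer is "yes, more or less": at the framework level a live feed already arrives as a stream of discrete frames, which is presumably what QC's Video Input patch is wrapping underneath. Here's a very rough sketch of the idea using today's AVFoundation (so the API, and names like FrameGrabber, are my own modern stand-ins rather than anything QC itself exposes):

    import AVFoundation

    // Rough sketch: a live camera feed really is delivered one frame at a
    // time, as a CVPixelBuffer per callback. (Assumes camera access is granted.)
    final class FrameGrabber: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
        let session = AVCaptureSession()
        private let output = AVCaptureVideoDataOutput()
        private let queue = DispatchQueue(label: "frame-grabber")

        func start() throws {
            guard let camera = AVCaptureDevice.default(for: .video) else { return }
            let input = try AVCaptureDeviceInput(device: camera)
            if session.canAddInput(input) { session.addInput(input) }
            output.setSampleBufferDelegate(self, queue: queue)
            if session.canAddOutput(output) { session.addOutput(output) }
            session.startRunning()
        }

        // Called once per frame: the "movie" is already a series of stills.
        func captureOutput(_ output: AVCaptureOutput,
                           didOutput sampleBuffer: CMSampleBuffer,
                           from connection: AVCaptureConnection) {
            guard let frame = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
            print("got a \(CVPixelBufferGetWidth(frame))x\(CVPixelBufferGetHeight(frame)) frame")
        }
    }

Point being, whether the source is a movie file or a camera, by the time anything draws it on screen it's frames all the way down.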
Anyway, my thought is this: I wonder if you could have some sort of shared video buffer that other applications (be they Flash, Director, VJ apps, etc.) could render into (assuming they had that built in or could be extended to do so), and that Quartz Composer could then grab and manipulate.
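To make that a bit more concrete, something like a shared IOSurface is roughly what I'm imagining. This is purely speculative on my part: IOSurface is newer OS X plumbing, nothing in QC exposes it, and I'm hand-waving the step where the surface handle actually gets passed between the two apps.

    import IOSurface
    import CoreVideo
    import Foundation

    // Speculative sketch of the shared video buffer: one process allocates a
    // surface and writes pixels into it; another process could read the very
    // same memory, no copying, once it has been handed the surface somehow.
    let properties: [String: Any] = [
        kIOSurfaceWidth as String: 640,
        kIOSurfaceHeight as String: 480,
        kIOSurfaceBytesPerElement as String: 4,              // 32-bit BGRA
        kIOSurfacePixelFormat as String: kCVPixelFormatType_32BGRA
    ]
    guard let surface = IOSurfaceCreate(properties as CFDictionary) else {
        fatalError("could not allocate the shared surface")
    }

    // Producer side (Flash, Director, a VJ app...): lock, write a frame, unlock.
    IOSurfaceLock(surface, [], nil)
    memset(IOSurfaceGetBaseAddress(surface), 0xFF, IOSurfaceGetAllocSize(surface))
    IOSurfaceUnlock(surface, [], nil)

    // Consumer side (a hypothetical QC patch) would receive the surface over
    // some IPC channel and wrap it in whatever image type it wants: a
    // CVPixelBuffer, an OpenGL texture, and so on.
    print("shared frame is \(IOSurfaceGetWidth(surface)) x \(IOSurfaceGetHeight(surface)) pixels")

The awkward bit is the hand-off: the other app has to be given the surface over some IPC channel, and QC would need a patch that knows how to turn it into an image, which brings me straight back to my usual gripe.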
Come on, Apple, give us a proper plug-in API. Then we might stop whingeing!
By no means have I explored the full extent of what QC can do, but I keep finding that I run into difficulties with some of my ideas because of its limitations. Perhaps I need to keep playing within those limitations so that they inform my ideas, rather than having the ideas first and then finding them nigh on impossible to implement in QC. Ah well.
SteamSHIFT out.
Technorati Tags:
quartz composer, theory, musing, art, design, motion graphics