GLCameraRipple
File size: 436 MB
21 Feb: Requires iOS or later. This app will only work on a device because of the camera-input requirement; due to the app's heavy use of the CPU, performance may suffer.

21 Feb: Illustrates how to capture YUV camera frames and process them using OpenGL ES.

7 Sep: Translated by OOPer in cooperation with liveattheenclave.com. As this is a line-by-line translation of the original sample code, the license clause "redistribute the Apple Software in its entirety and without modifications" still applies. Please do not contact Apple or SHLab(jp) about any faults.
This is a port of Apple's GLCameraRipple sample to C# and MonoTouch: http://liveattheenclave.com#samplecode/GLCameraRipple/Introduction/Intro

The camera rippling effect works by deforming texture coordinates on a grid of triangles, so the good news is that the effect you want is achievable.

28 May: The Apple GLCameraRipple demo is pretty cool: liveattheenclave.com/library/ios/#samplecode/GLCameraRipple/Introduction/liveattheenclave.com
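The grid-deformation idea described above can be sketched with the classic two-buffer water-ripple algorithm: a height field is advanced with a finite-difference step, and texture-coordinate offsets are taken from the gradient of the heights. This is a hedged illustration in Python, not Apple's actual RippleModel code; the damping constant, the 1/1024 gradient scale, and the function names are assumptions for the sketch.

```python
# Classic two-buffer water-ripple simulation. Hypothetical sketch:
# Apple's RippleModel uses the same finite-difference idea, but with
# its own buffers, constants, and mesh layout.

DAMPING = 0.96  # assumed energy-loss factor per frame

def step(prev, curr):
    """Advance the height field one frame.

    `prev` and `curr` are 2D lists of floats with the same shape.
    Each interior cell becomes the average of its four neighbours,
    doubled by the /2, minus its own previous value, then damped.
    Borders are pinned to zero for simplicity."""
    h, w = len(curr), len(curr[0])
    nxt = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            nxt[y][x] = ((curr[y][x - 1] + curr[y][x + 1] +
                          curr[y - 1][x] + curr[y + 1][x]) / 2.0
                         - prev[y][x]) * DAMPING
    return nxt

def texcoord_offsets(height):
    """Per-vertex (du, dv) texture-coordinate offsets from the
    horizontal and vertical gradient of the height field; sampling
    the camera texture at (u + du, v + dv) gives the refraction-like
    ripple look."""
    h, w = len(height), len(height[0])
    offs = [[(0.0, 0.0)] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            du = (height[y][x - 1] - height[y][x + 1]) / 1024.0
            dv = (height[y - 1][x] - height[y + 1][x]) / 1024.0
            offs[y][x] = (du, dv)
    return offs
```

"Dropping a stone" is just writing a nonzero value into `curr` where the touch landed; successive calls to `step` then spread the disturbance outward while the damping factor makes it fade.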
2 Mar: I re-examined the change and found that there is some sample code from Apple called GLCameraRipple which does pretty much the same thing. It illustrates how to capture YUV camera frames and process them using OpenGL ES.

I want a water ripple effect on my background sprite, but CCRipple3D performs poorly when I scroll the screen, and I saw that there is an official Apple project for this.

6 Feb: Modified to use OpenGL ES. This sample is ideal for beginner iOS developers who want to use the latest APIs. See liveattheenclave.com in monotouch-samples, located at /GLCameraRipple.
16 November: archive contents include GLCameraRipple\GLCameraRipple\AppDelegate.h and GLCameraRipple\GLCameraRipple\AppDelegate.m.

Take a look at this example from Apple (liveattheenclave.com samplecode/GLCameraRipple/Introduction/liveattheenclave.com): it uses the live camera capture as a texture to give the appearance of ripples. Looks awesome.

GLCameraRipple. URL: liveattheenclave.com

25 Dec: GLCameraRipple. This sample demonstrates how to use the AVFoundation framework to capture YUV frames from the camera and process them.
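Processing the captured YUV frames ultimately means turning Y/CbCr samples into RGB; in the sample this happens on the GPU in a fragment shader fed by separate luma and chroma textures. A minimal per-pixel sketch of the math in Python, assuming full-range BT.601 coefficients (a common choice for camera output, not verified against the sample's shader):

```python
def yuv_to_rgb(y, cb, cr):
    """Convert one full-range BT.601 YCbCr pixel (each component 0-255)
    to an (r, g, b) tuple of ints, also 0-255.

    Coefficients are the standard full-range BT.601 values; the actual
    matrix used by the sample's shader may differ slightly."""
    r = y + 1.402 * (cr - 128)
    g = y - 0.344136 * (cb - 128) - 0.714136 * (cr - 128)
    b = y + 1.772 * (cb - 128)
    clamp = lambda v: max(0, min(255, int(round(v))))
    return clamp(r), clamp(g), clamp(b)
```

Because chroma is stored bi-planar and subsampled (one CbCr pair per 2x2 block of Y samples in the 4:2:0 layout the camera delivers), each CbCr pair is reused for four luma samples; the GPU's texture filtering handles that interpolation for free, which is one reason the shader approach is fast.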