This may be possible, but the developer would need to go into the image and change the pixels individually. Then they'd need to save every possible image based on the user's possible shake/accelerometer movements, and finally implement logic to decide which image to show based on the user's shaking.
And even then, the animation would still be shaky and would likely slow your entire phone down.
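To make the described approach concrete, here is a minimal sketch of the frame-lookup step: quantizing a tilt reading into one of N pre-rendered images. All names and the frame count here are hypothetical, purely to illustrate why this scales badly (one saved image per possible reading).

```python
# Sketch of the naive "pre-rendered frames" idea described above:
# map a hypothetical accelerometer tilt value to a pre-saved image file.
# Filenames and frame count are illustrative, not from any real app.

def frame_for_tilt(tilt: float, num_frames: int = 16) -> str:
    """Map a tilt value in [-1.0, 1.0] to a pre-rendered frame filename."""
    clamped = max(-1.0, min(1.0, tilt))
    # Scale [-1, 1] onto frame indices 0 .. num_frames - 1.
    index = round((clamped + 1.0) / 2.0 * (num_frames - 1))
    return f"icon_frame_{index:02d}.png"

print(frame_for_tilt(-1.0))  # first pre-rendered frame
print(frame_for_tilt(1.0))   # last pre-rendered frame
```

Even this toy version assumes every frame already exists on disk; covering smooth motion in two axes would multiply the image count quickly, which is the answer's point about it being slow and impractical.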
Which makes me highly doubt this is what was done for the iPhone/iPod touch.