2017/07/12

Demo movie using ARKit in iOS 11

I uploaded a demo movie to YouTube. The video was made with iMovie.



↑First, the app detects a plane with ARKit and creates an unseen static physics object with SceneKit.
I made dae files of the Nintendo Switch and ARMS boxes with Blender and imported them into SceneKit in Xcode.
ARKit is really amazing!
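
For reference, the plane handling works roughly like this hedged sketch (not my exact code; the node sizing and names are assumptions):

- (void)renderer:(id<SCNSceneRenderer>)renderer
      didAddNode:(SCNNode *)node
       forAnchor:(ARAnchor *)anchor
{
    // Called when ARKit adds an anchor; only handle detected planes.
    if (![anchor isKindOfClass:[ARPlaneAnchor class]]) return;
    ARPlaneAnchor *planeAnchor = (ARPlaneAnchor *)anchor;

    // A thin box matching the detected plane's extent.
    SCNBox *box = [SCNBox boxWithWidth:planeAnchor.extent.x
                                height:0.01
                                length:planeAnchor.extent.z
                         chamferRadius:0];
    SCNNode *planeNode = [SCNNode nodeWithGeometry:box];
    planeNode.position = SCNVector3Make(planeAnchor.center.x, 0, planeAnchor.center.z);
    planeNode.opacity = 0.0; // unseen, but it still collides

    // Static physics body so the imported models can rest on the plane.
    planeNode.physicsBody =
        [SCNPhysicsBody bodyWithType:SCNPhysicsBodyTypeStatic
                               shape:[SCNPhysicsShape shapeWithGeometry:box options:nil]];
    [node addChildNode:planeNode];
}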



iOS 11 beta 3
Xcode 9 beta 3
2017/02/20

Import dae file from MakeHuman to SceneKit

MakeHuman | Open source tool for making 3d characters
I wanted to use model data created with MakeHuman (↑) in SceneKit, so I tried several approaches. Here is the one that worked.


Screenshot of MakeHuman

↑Make the model and facial expressions in MakeHuman. You have to add objects such as teeth and tongue yourself, because they may not exist by default.

Export a dae file from MakeHuman. A dae file and a folder with some textures appear.

Drag and drop them into Xcode, keeping their folder hierarchy.

You have to set the texture images in the Scene Editor in Xcode. For example, you may have to set teeth.png as the diffuse image of the teeth object.



↑Then it worked.
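
If you prefer code over the Scene Editor, the same assignment can also be done programmatically; a minimal sketch (the file name "human.dae" and the node name "teeth" are assumptions):

SCNScene *scene = [SCNScene sceneNamed:@"human.dae"];
// Find the imported node and assign its diffuse texture by hand.
SCNNode *teethNode = [scene.rootNode childNodeWithName:@"teeth" recursively:YES];
teethNode.geometry.firstMaterial.diffuse.contents = [UIImage imageNamed:@"teeth.png"];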


The following are approaches I tried that failed.

(A) Following a webpage I saw somewhere, I tried the steps below:
1) Export a dae file from MakeHuman
2) Import it into Blender, set the textures, and export a dae file
3) Import it into Xcode
But as I wrote above, Blender turned out to be unnecessary.

(B) Use ManuelBastioniLAB for Blender



Screenshot of ManuelBastioniLAB

↑This library renders a human like this. It looks nice, but when I exported the data as a dae file and imported it into Xcode, it did NOT render as I expected… I didn't know how to fix it and gave up.

Screenshot of Scene Editor

↑Also, when I brought dae data in from Blender, the model was always rotated 90 degrees in the Scene Editor, but it appeared as I expected when the app was running.



*versions
macOS Sierra 10.12.3
Xcode 8.2.1
iOS 10.2.1
MakeHuman 1.1.0
Blender 2.78a
ManuelBastioniLAB version 1.4.0a
2017/02/08

Released new VR app for iOS

This is my new iOS app, VR CROSS ROAD. You can enjoy it with your VR glasses.
Link and screenshots :
VR CROSS ROAD

VR CROSS ROAD screenshot 1

VR CROSS ROAD screenshot 2

VR CROSS ROAD screenshot 3

VR CROSS ROAD screenshot 4

VR CROSS ROAD screenshot 5

When you put your iPhone into a VR headset, you usually can NOT control the device, so in most VR apps for iOS you can only watch.
In this app, however, you can control the movement of the robot with sound. The app listens through the microphone, and when the volume exceeds a threshold, the robot walks forward by one lane. A clap, a tut, or any other loud sound is OK.
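
The detection is essentially a level check; a minimal sketch, assuming an AVAudioRecorder created with metering enabled (the threshold value and the -walkOneLane method are hypothetical):

- (void)checkMicrophoneLevel
{
    [self.recorder updateMeters];                            // recorder was created with meteringEnabled = YES
    float power = [self.recorder averagePowerForChannel:0];  // dBFS, 0 dB is full scale
    if (power > -10.0f) {                                    // assumed threshold
        [self walkOneLane];                                  // hypothetical: move the robot by one lane
    }
}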

The app is free and includes an in-app purchase to unlock new objects.

2016/09/01

How to save/remake MTLTexture to/from memory

I have been making a VR game app, and below is a short video of the game.



After a collision (which means Game Over), the scene is replayed as a short animation. This post is about how to do it.

You need three render passes (for the right eye, the left eye, and recording), and the recording pass outputs to a texture instead of the display.

You can get the raw data of the texture with this method:

- (void)getBytes:(void *)pixelBytes bytesPerRow:(NSUInteger)bytesPerRow fromRegion:(MTLRegion)region mipmapLevel:(NSUInteger)mipmapLevel


You have to allocate the memory you need in advance.

{
    // Copy this frame's pixels into the slot for imageIndex in the preallocated buffer.
    MTLRegion region = MTLRegionMake2D(0, 0, ImageWidth, ImageHeight); // 172x129
    int bytesPerPixel = 4;                                             // 4 bytes per pixel (BGRA)
    int bytesPerRow = bytesPerPixel * ImageWidth;
    UInt8 *recordingBuffer = (UInt8 *)&imageBuffer[unitSize * imageIndex]; // start of this frame's slot
    [texture getBytes:recordingBuffer bytesPerRow:bytesPerRow fromRegion:region mipmapLevel:0];
}
unitSize is the amount of memory for one texture, in bytes.
By incrementing imageIndex every frame, the app keeps saving data to the proper address. Of course imageIndex has a maximum value, and when it goes over the limit it wraps back to zero, so the app overwrites the oldest data.
The element type of the raw MTLTexture data is UInt8.
This process runs every frame, all the time.
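
In other words, it is a ring buffer; a minimal sketch of the bookkeeping (the capacity kMaxImages is an assumption):

static const int kMaxImages = 600;                            // assumed capacity
size_t unitSize = ImageWidth * ImageHeight * 4;               // bytes for one texture
UInt8 *imageBuffer = (UInt8 *)malloc(unitSize * kMaxImages);  // allocated once, in advance
// ...every frame, after copying the texture bytes:
imageIndex = (imageIndex + 1) % kMaxImages;                   // wrap around and overwrite old data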

And when the replay has to be shown, the app rebuilds id<MTLTexture> objects from the saved data, going through UIImage objects. The code below converts the saved data into a UIImage:

+(UIImage *)makeImageFromUInt8Data:(UInt8 *)buffer size:(CGSize)size
{
    size_t width = size.width;
    size_t height = size.height;
    size_t bitsPerComponent = 8;
    size_t bitsPerPixel = 32;
    size_t bytesPerRow = 4*width;
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    // Byte order matching the Metal texture's pixel layout (BGRA).
    CGBitmapInfo bitmapInfo = kCGBitmapByteOrder32Little | kCGImageAlphaFirst;
    bool shouldInterpolate = true;
    CGColorRenderingIntent intent = kCGRenderingIntentDefault;
    
    // Wrap the raw bytes (copied) in a data provider for CGImageCreate.
    CFDataRef data = CFDataCreate(NULL, buffer, width*height*4);
    CGDataProviderRef dataProvider = CGDataProviderCreateWithCFData(data);
    
    CGImageRef cgimage = CGImageCreate(
                            width, height,
                            bitsPerComponent, bitsPerPixel, bytesPerRow,
                            colorSpace, bitmapInfo, dataProvider,
                            NULL, shouldInterpolate, intent);
    
    UIImage *image = [UIImage imageWithCGImage:cgimage];

    // Release everything created above; the UIImage keeps its own reference.
    CGImageRelease(cgimage);
    CGDataProviderRelease(dataProvider);
    CGColorSpaceRelease(colorSpace);
    CFRelease(data);
    
    return image;
}
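
The remaining step (UIImage to id<MTLTexture>) can be done with MTKTextureLoader; a minimal sketch, assuming a device variable holding the id<MTLDevice> (not necessarily my exact code):

#import <MetalKit/MetalKit.h>

MTKTextureLoader *loader = [[MTKTextureLoader alloc] initWithDevice:device];
NSError *error = nil;
// Convert the UIImage's CGImage back into a Metal texture for the replay pass.
id<MTLTexture> replayTexture =
    [loader newTextureWithCGImage:image.CGImage options:nil error:&error];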

2016/07/20

Real-time sound FFT analysis with the iPhone's microphone

I was not able to use the iPhone's accelerometer as 'input' for VR apps, so I tried to use sound instead: for example, when you clap, a missile fires.

And I found one good webpage :
Capture iPhone microphone | Stefan Popp

With the help of this page, I finally got the sound spectrum from the iPhone's microphone.

Tone Generator

↑I downloaded this tone generator app to my iPad to check that my code actually detected the correct frequency.

real time FFT result

↑I simply moved UIView objects according to the source sound, where the vertical axis is frequency (logarithmic) and the horizontal axis is the spectrum magnitude.
Judging from this, my code seems to analyze the sound correctly.
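
For reference, the spectrum computation was done with an FFT; here is a minimal sketch with the Accelerate framework (vDSP), which is an assumption about the approach rather than the code from the linked page:

#import <Accelerate/Accelerate.h>

// Magnitude spectrum of one buffer of n microphone samples (n must be a power of two).
void computeSpectrum(const float *samples, int n, float *magnitudes /* n/2 values */)
{
    vDSP_Length log2n = (vDSP_Length)log2f((float)n);
    FFTSetup setup = vDSP_create_fftsetup(log2n, kFFTRadix2);

    // Pack the real input into the split-complex layout expected by vDSP_fft_zrip.
    float real[n/2], imag[n/2];
    DSPSplitComplex split = { real, imag };
    vDSP_ctoz((const DSPComplex *)samples, 2, &split, 1, n/2);

    vDSP_fft_zrip(setup, &split, 1, log2n, FFT_FORWARD); // in-place real FFT
    vDSP_zvabs(&split, 1, magnitudes, 1, n/2);           // magnitude of each frequency bin

    vDSP_destroy_fftsetup(setup);
}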


But… I was not able to find a proper way to compare sounds. If the app must recognize a clap, it needs clap-sound spectra recorded in advance, and then it has to compare the real-time spectrum with those.
At first I simply summed the differences between the spectra, but that failed because the error was too large. I tried other ways, but they all failed too.
So I gave up on FFT analysis. I will just check the sound level and treat that as 'input'… orz