Friday, June 5, 2009

「夫婦箸」, Meotobashi chopsticks

It’s a typical Japanese gift. Although chopsticks are widely used across Asia, the Japanese have pushed them to a remarkable level in everyday life. They give chopsticks as gifts, and they have developed a strict code of chopstick manners, much stricter than in China or other Asian countries. For example, in almost every Japanese family, each member has his or her own fixed chopsticks and bowl. I have not seen this custom in any other Asian country.

A typical pair of Japanese chopsticks [from Wikipedia]

Meotobashi, a kind of gift chopsticks, is widely given to new couples when they get married. It usually comes in a beautiful box containing two pairs of chopsticks in the same color and design; some sets have two pairs of the same length, and some have a shorter pair for the wife. The meaning is that husband and wife will always, and yes, they mean for this whole life, eat their meals together with these chopsticks.

Two typical Meotobashi sets

If you want a Japanese gift for your friends, especially for a couple, this is a good choice; explain its meaning to them, and I am pretty sure they will be happy with it.

Sunday, May 31, 2009

iPhone 3rd Generation New Features Exposed

Hey, UMPCFever, you are not only the first to expose the iPhone 3Gen in HK; you are also the first in the world to bring us pictures of these new features.

[Sorry, I used your pictures directly!]

Auto focus + Video Recording

Auto Focus

Taken by Auto Focus

Taken by iPhone 2Gen

Digital Compass

CPU Activity (for testing only; the retail unit will hide this function, I guess)

Cool! I can not wait to see it!

AVAudioPlayer could not play after recording via AudioQueue Object.


I've been fighting with AVAudioPlayer and the AudioQueue object for two days.

Here is my dev environment:
iPhone OS: 3.0 Beta 5
OS X 10.5.7

Here is my scenario:
1. I need to use AVAudioPlayer to play a CAF file in the IMA4 data format, including fast forward and rewind.
2. I need to use an AudioQueue object to record audio, because 3.0 has not gone public yet.
3. A common scenario: record an audio file A, then play the original audio A' (not the one just recorded), or vice versa.

The problem is that I could NOT play the A' file after recording file A.
[theAVAudioPlayer play] always returns NO, but [theAVAudioPlayer prepareToPlay] returns YES.
This only happened after I used the AudioQueue object for recording. If I did not record, playback always worked fine.

About the cleanup:
1. I released the theAVAudioPlayer object and set it to nil each time after I finished playing.
2. I closed the audio file and disposed of the AudioQueue object each time after I finished recording.


Thanks to Vitali Molodtsov [open the original post, needs login], the solution is to reset the AudioSessionCategory to kAudioSessionCategory_MediaPlayback right after recording finishes. Then everything works fine.
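A minimal sketch of that fix using the AudioSession C API (call it right after disposing the AudioQueue; error handling is omitted):

```objc
#include <AudioToolbox/AudioToolbox.h>

// Call this right after AudioQueueDispose() when recording finishes,
// so that the next AVAudioPlayer -play call will succeed again.
static void ResetSessionForPlayback(void) {
    UInt32 category = kAudioSessionCategory_MediaPlayback;
    AudioSessionSetProperty(kAudioSessionProperty_AudioCategory,
                            sizeof(category), &category);
}
```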


I think this is a bug in AVAudioPlayer. Each time we initialize an AVAudioPlayer object, it should silently set the correct AudioSessionCategory for us, but it doesn’t. So I’ve filed a bug with Apple.

Saturday, May 30, 2009

The Best Way to Declare a Constant in Objective-C

Thanks to Barry Wark, from [here].

Create a header file like

// Constants.h
extern NSString * const MyFirstConstant;
extern NSString * const MySecondConstant;

You can include this file in each file that uses the constants or in the pre-compiled header for the project.

You define these constants in a .m file like

// Constants.m
NSString * const MyFirstConstant = @"FirstConstant";
NSString * const MySecondConstant = @"SecondConstant";

Constants.m should be added to your application/framework's target so that it is linked in to the final product.

The advantage of using string constants instead of #define constants is that you can test for equality using pointer comparison (stringInstance == MyFirstConstant), which is much faster than string comparison ([stringInstance isEqualToString:MyFirstConstant]) (and easier to read, IMO).
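For example, a hypothetical handler (handleEvent and the log message are illustrative only) can use the pointer comparison directly:

```objc
#import <Foundation/Foundation.h>
#import "Constants.h"

// Every reference to MyFirstConstant is the same object, so pointer
// equality is sufficient (and cheap) here.
void handleEvent(NSString *eventName) {
    if (eventName == MyFirstConstant) {
        NSLog(@"got the first constant");
    }
}
```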

Monday, May 25, 2009

A Weird Exception from iPhone Simulator

Today I met a weird exception from AUGraphStart().

Basically, I was running an old sample I wrote myself. Its main function is to mix two audio files and play them back simultaneously. It worked fine before, but when I ran it this time, I got this exception:

[00:23:58.425 <AURemoteIOServer>] AQMEIOBase::DoStartIO: timeout
[00:23:58.701 <AURemoteIOServer>] AQMEDevice::StartIO: AudioOutputUnitStart returned -66681
[00:23:58.701 <0xa084a720>] AUIOClient_StartIO failed (-66681)
AUGraphStart FFFEFB87

Obviously, it didn't come from my code, and I have no idea about this exception right now. Has anybody gotten the same exception recently?

I had tried these:

   1. Restart Xcode. -- No effect.

   2. Restart the machine (Mac Pro, 10.5.7) and restart Xcode. -- No effect.

   3. Reinstall the most recent SDK (Build 9M2735), restart Xcode. -- No effect.

   4. Set the base SDK to 2.2.1 and the target to Simulator 2.2.1 Debug. -- No effect.

It's really odd!

Please, if anybody had met similar exception and got a solution, help me out!

I still don’t know how to resolve it. I need help!

[Update 2009-06-11]

A week ago, an Apple engineer confirmed that they have a bug in the UIKit framework.

We do have a simple workaround for this UIKit bug. Change this:

  UIKIT_EXTERN @interface UILocalizedIndexedCollation : NSObject

to this:

  UIKIT_EXTERN_CLASS @interface UILocalizedIndexedCollation : NSObject

Having done that, we can build your project and confirm that AUGraphStart returns 0 in the 3.0 simulator.


But after I installed the 3.0 GM version today, the fix is still not included in the GM. So that means I still cannot play audio in the simulator; I can ONLY do it on the device!

Friday, May 15, 2009



Farewell,, you did bring me a lot of fun. But it has also become a burden to follow you as you grow bigger and bigger, stronger and stronger. I am pretty sure you will get more and more popular, and I am also pretty sure more and more people will follow your spirit and enjoy everyday life. But I just can’t. Thank you for all the days with me, bringing me fun and fresh news. From the next minute, you will be removed from my Google Reader, and I will miss you~

Take care!

Thursday, May 14, 2009

Best Audio Format for iPhone Audio Programming

I had never done audio programming before I started iPhone programming. After that, I began to learn the Core Audio framework and Audio Units for Mac OS X and the iPhone, and the biggest problem was: which audio format should I choose as a best practice?

After the iPhone OS 3.0 beta was released, Apple finally introduced the AVAudioRecorder class in the AVFoundation framework, with which we can play and record audio much more efficiently! For me, it cut the original playback code down from 400+ lines to 20, and the recording code from 300+ lines to 30.

When testing AVAudioRecorder, the data format significantly affects the file size. I tried all the data formats available in a .CAF container. Here are the results:

[Updated@2009-05-25] YES, we can record in this format on the simulator, but we CANNOT record it on the device! Remember this. Another reference is Apple’s official QA1615.

[Updated@2009-06-15] Please do not use AAC if you want to play a system sound.

[Updated@2009-09-30] The iPhone 3GS supports AAC recording since iPhone OS 3.1 was released on Sept 9th: you can use hardware-assisted encoding to record AAC-formatted audio files. But let me remind you: you cannot use AVAudioRecorder to record AAC audio files yet. The only option is the RemoteIO unit.

  • kAudioFormatLinearPCM               = 'lpcm',    OK    20.2M    .CAF
  • kAudioFormatAppleIMA4               = 'ima4',    OK    2.7M     .CAF [Best Choice]
  • kAudioFormatMPEG4AAC                = 'aac ',    OK    968K     .CAF [Updated 2009-05-25], [Updated 2009-09-30] Best Choice for iPhone 3GS
  • kAudioFormatMACE3                   = 'MAC3',    NG            .CAF
  • kAudioFormatMACE6                   = 'MAC6',    NG            .CAF
  • kAudioFormatULaw                    = 'ulaw',    OK    5.1M     .CAF
  • kAudioFormatALaw                    = 'alaw',    OK    5.1M     .CAF
  • kAudioFormatQDesign                 = 'QDMC',    NG            .CAF
  • kAudioFormatQDesign2                = 'QDM2',    NG            .CAF
  • kAudioFormatQUALCOMM                = 'Qclp',    NG            .CAF
  • kAudioFormatMPEGLayer1              = '.mp1',    NG            .CAF
  • kAudioFormatMPEGLayer2              = '.mp2',    NG            .CAF
  • kAudioFormatMPEGLayer3              = '.mp3',    NG            .CAF
  • kAudioFormatAppleLossless           = 'alac',    OK    4M       .CAF
  • kAudioFormatMPEG4AAC_LD             = 'aacl',    NG            .CAF
  • kAudioFormatAMR                     = 'samr',    NG            .CAF
  • kAudioFormatiLBC                    = 'ilbc',    NG    Weird    .CAF

The list omits all the other data formats that are not supported by the .CAF file extension. You can use the command-line tool afconvert to list every supported file format and data format by typing:

afconvert -h 

All the tests recorded 60 seconds of audio, at a sample rate of 44100, 2 channels, 96 kbps. The recordSetting dictionary object was used as below:

recordSetting = [[NSMutableDictionary alloc] init];

//General Audio Format Settings (necessary for every audio format)
[recordSetting setValue:[NSNumber numberWithInt: kAudioFormatAppleIMA4] forKey:AVFormatIDKey];
[recordSetting setValue:[NSNumber numberWithFloat:44100.0] forKey:AVSampleRateKey];
[recordSetting setValue:[NSNumber numberWithInt: 2] forKey:AVNumberOfChannelsKey];

//Linear PCM Format Settings (only necessary when you want to record the Linear PCM format)
[recordSetting setValue:[NSNumber numberWithInt: 32] forKey:AVLinearPCMBitDepthKey];
[recordSetting setValue:[NSNumber numberWithBool:NO] forKey:AVLinearPCMIsBigEndianKey];
[recordSetting setValue:[NSNumber numberWithBool:NO] forKey:AVLinearPCMIsFloatKey];

//Encoder Settings (Only necessary if you want to change it.)
[recordSetting setValue:[NSNumber numberWithInt:AVAudioQualityMin] forKey:AVEncoderAudioQualityKey];
[recordSetting setValue:[NSNumber numberWithInt:96] forKey:AVEncoderBitRateKey];
[recordSetting setValue:[NSNumber numberWithInt:16] forKey:AVEncoderBitDepthHintKey];

//Sample Rate Conversion Settings (only necessary when you want a sample rate different from the hardware sample rate; AVAudioQualityHigh means no conversion, usually 44.1kHz)
[recordSetting setValue:[NSNumber numberWithInt:AVAudioQualityHigh] forKey:AVSampleRateConverterAudioQualityKey];

[Updated 2009-09-30] The record settings listed above are for example only; if you want to record IMA4 or another compressed audio format, you do not need the Linear PCM part. The other parts are commented with their purpose.
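For completeness, a minimal recording sketch that plugs a settings dictionary like the one above into AVAudioRecorder (the file path and error handling are illustrative only):

```objc
#import <AVFoundation/AVFoundation.h>

// Assumes recordSetting was built as shown above.
NSURL *url = [NSURL fileURLWithPath:
    [NSTemporaryDirectory() stringByAppendingPathComponent:@"test.caf"]];
NSError *error = nil;
AVAudioRecorder *recorder =
    [[AVAudioRecorder alloc] initWithURL:url
                                settings:recordSetting
                                   error:&error];
if (recorder && [recorder prepareToRecord]) {
    [recorder record];   // start recording; call -stop when done
}
```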

As you can see, MPEG4AAC has the minimum file size with the same quality compared to the other formats. About the AAC format, please refer to [Wikipedia – AAC].

Apple, you could just give us a best-practice guide, so I wouldn't need to test these one by one myself. OK, I know, you are running out of people; hire me, I would love to do this work.

Thursday, May 7, 2009

A good story

We went there riding the spring breeze of President Hu's visit to Africa; the timing was good, and we were received at a very high level. The younger brother of the local tribal chief drove over in two BMW 745s, leading a motorcade to welcome us. For the rest of those days, we rode in those two cars, flying along gravel roads at 100 miles per hour.




The iron ore here is piled in the open, and the quantity is huge, over a hundred million tons. The grade is very high, close to 60% iron content, and it costs only 50 US cents per ton, very cheap. But transport is hard: getting it to the port costs more than 20 dollars, leaving the port adds another 10, and shipping it to China brings it to over 70 dollars per ton. With ocean freight prices soaring, the cost is even higher, so transport is the key.

About 60% iron and 8%-12% titanium: in terms of iron content, this would of course count as very rich ore in China, where the average iron grade is only 30%; ore above 64% iron can be fed directly into steelmaking. Besides the iron and titanium, impurities such as sulfur and phosphorus are only a few tenths of a percent, so what is the rest? Oxygen. All the iron and titanium in the ore exist as iron oxide and titanium oxide; in fact, even pure iron-oxide ore contains only 70%-72% iron. Working it out with a bit of chemistry, the sample is basically a mixture of iron oxide and titanium oxide with almost nothing else: an excellent titanium-iron (ilmenite-type) deposit! So the sample is not just simple iron ore, and of course they wanted to take it back; otherwise, once our technicians back home analyzed it, we would have understood the trick.

Now it was clear: the processing plant they wanted to build was really meant to separate the titanium oxide out of the ore, leaving the iron behind. They were building not an iron-ore processing plant but a titanium separation plant, which makes it easy to understand why the investment they needed was more than 200 times that of a domestic iron-ore processing plant. Titanium sells internationally for over ten US dollars per kilogram, while iron ore, net of freight, is worth at most 10 dollars a ton here, a difference in value of 1000 times. Even calculating by content (10 tons of raw ore yielding 1 ton of titanium oxide and 9 tons of iron-oxide ore), the gap is still more than 100-fold. And now they would keep all the separated titanium, telling us it had been disposed of as waste, while also sharing the profit on our iron ore. In other words, the Chinese side would put up all the capital and bear all the operating costs and risks, while they took more than 99% of the profit and the Chinese got less than 1%!

Wednesday, April 29, 2009

Wow! Look at these prizes

Damn it, I might have missed this competition.

Apple Design Awards

Great Software Deserves Great Prizes
Winners will receive two 15-inch MacBook Pros (best configuration), two 30-inch Apple Cinema Displays, two 16GB iPhone 3Gs, two 16GB iPod touch, an ADC 2009 Premier Membership, and reimbursement of their WWDC 2009 E-ticket.

Student category winners will receive one 15-inch MacBook Pro, one 30-inch Apple Cinema Display, one 16GB iPod touch, one 8GB iPhone 3G, one ADC 2009 Student Membership, and reimbursement of their WWDC 2009 E-ticket.

Entries will be accepted beginning Thursday, April 2 and ending at 5PM (PST) Monday, May 4, 2009.

OK... I'll hope for next time. I am pretty sure Apple will run this again.

What could error code 560030580 mean?

When Apple defines its error enumerations, it uses a fancy, cool trick to encode the error codes. (I like to call these magic error numbers, whatever Apple calls them.)

Just look at this:

kAudioFormatUnsupportedDataFormatError 1718449215 = ‘fmt?’
The playback data format is unsupported (declared in AudioFormat.h).
Available in iPhone OS 2.0 and later.

When you run your app and get an error code, the only thing you see in the console is an error number like 1718449215 or 560030580; if you don't know what this magic number is, you have no idea what it means. But if you paste the number into the Calculator app (I mean the system Calculator application, as below), you will see the real meaning hidden in the number.

Paste the magic error number and see the real meaning

Did you see? In the bottom-left corner, there is a string, "!act". That's what this number means.

The main idea behind these magic error numbers is how a 4-byte Int32 data type is interpreted. When we define a number in C, C++, or Obj-C, we can write:

Int32 a = 'abcd';

This is legal and cool; it gives every number an abbreviated meaning.

Well, frankly, it is a good trick! But the nightmare is that Apple did not use it everywhere; only some of the current iPhone APIs adopt this mechanism, especially Core Audio. So if you find a strange error code, just copy it, paste it into Calculator, and see what you get. It might help.

Tuesday, April 28, 2009

I am at Twitter now!

OK, I admit I've been behind the times for a while; now I am back, with Twitter!

Wow~~, me! Twitter me now!

Sunday, April 19, 2009

What a real swipe should be?


Before we get to the correct answer, let's see what Apple's engineers told us to do. Look at this video [Matt Drance: Advanced UIKit and Device Features @ iPhone Tech Talk World Tour], seek to 47:15, and you will see Matt Drance telling us to do this:

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    // “start” is an instance variable
    start = [[touches anyObject] locationInView:self];
}

//Slide sub-title: Analyze direction and speed

#define HORIZ_DRAG_MIN 12
#define VERT_DRAG_MAX 4

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint current = [[touches anyObject] locationInView:self];
    float deltaX = fabsf(current.x - start.x);
    float deltaY = fabsf(current.y - start.y);

    if (deltaX >= HORIZ_DRAG_MIN && deltaY <= VERT_DRAG_MAX) {
        // it’s a swipe
    } else {
        // it’s not a swipe
    }
}

OK, I have to say, when you first read this, it does sound like a reasonable way to decide whether a series of touch events is a swipe or not. But when you really test it, you will see that you are missing a serious scenario:

Touch down –> move to the right –> then move to the left quickly (you don't even need to be that quick).

Or vice versa. You will only ever get the swipe-right result. That's not what we really wanted, is it?

The problem lies in how we determine the swipe action. If you test the above scenario in a UITableView, which handles swipes for us automatically, you will see that the UITableView works fine: when you move back in the other direction, it moves the table view in the correct direction, as we want. That means we haven't been catching the right swipe events. So where is the problem? The problem is that Matt was wrong, or at least he did not show us the full picture of a swipe event.

He did write that we need to analyze the direction and speed. The solution is to analyze the speed between the current movement and the movement that happened right before it.

As a conclusion, you need to use:

     moveSpeed = fabsf(deltaX / deltaTime);

Where deltaTime is the difference between the timestamps of the two movement events. When this speed is over 100 pixels/second, you can say it is a swipe. (100 pixels/s is my judgment; you might choose a higher or lower threshold.)


Here is the full story.

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    // Record the start point
    theSecondLastPosition = [touch locationInView:self];
    theSecondLastTime = touch.timestamp;
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint currentTouchPosition = [touch locationInView:self];
    float deltaX = currentTouchPosition.x - theSecondLastPosition.x;
    float deltaY = currentTouchPosition.y - theSecondLastPosition.y;
    moveSpeed = fabsf(deltaX / (touch.timestamp - theSecondLastTime));

    // If the touch events fall into the following pattern, it is a swipe
    if (moveSpeed >= SWIPE_SPEED) {
        if (theSecondLastPosition.x < currentTouchPosition.x) {
            swipeType = SWIPE_RIGHT;
        } else {
            swipeType = SWIPE_LEFT;
        }
    } else {
        // Process a non-swipe event.
        swipeType = NOT_A_SWIPE;
    }

    theSecondLastPosition = [touch locationInView:self];
    theSecondLastTime = touch.timestamp;
}

Wow~~~, enjoy your new swipe events!

Saturday, April 18, 2009

More about AudioUnitSetParameter

Apple released the iPhone SDK based on Mac OS X 10.5, and a lot of concepts from Mac OS development are used directly in iPhone development. But to tell the truth, even the Mac documentation still lacks a lot of details, so Apple simply left those jungles to iPhone developers. Today, let's talk about AudioUnit and its parameters.

AudioUnit is a good library; it provides a lot of predefined processing for dealing with audio data. But I have to say: Apple, you really left a huge jungle in AudioUnit. You can find almost no useful documents to help you understand the APIs. If you want to use an AudioUnit, you need to invoke a lot of functions in a certain sequence; otherwise, the AudioUnit will refuse to work.

Even after you have properly set up the AudioUnits and the AUGraph, you may still be confused when you want to adjust some properties of an AudioUnit on the fly; you will find yourself falling into another jungle. It is hard to know which parameter to use and how to use it, because the only way to access the parameters of an AudioUnit is AudioUnitSetParameter. If you look into this API's documentation, you will see what the real jungle is:

Sets the value of an audio unit parameter.

OSStatus AudioUnitSetParameter (
    AudioUnit                 inUnit,
    AudioUnitParameterID      inID,
    AudioUnitScope            inScope,
    AudioUnitElement          inElement,
    AudioUnitParameterValue   inValue,
    UInt32                    inBufferOffsetInFrames
);

inUnit
The audio unit that you want to set a parameter value for.

inID
The audio unit parameter identifier.

inScope
The audio unit scope for the parameter.

inElement
The audio unit element for the parameter.

inValue
The value that you want to apply to the parameter.

inBufferOffsetInFrames
Set this to 0. To schedule the setting of a parameter value, use the AudioUnitScheduleParameters function.

Return Value
A result code.

Available in iPhone OS 2.0 and later.
See, the first parameter is easy to understand: it is the AudioUnit whose parameter you want to set or change.
The second one is a little confusing, because you first need to find out which parameters each AudioUnit has at all. To tell the truth, if you don't know a parameter's name, there is no way for you to jump to the parameter's definition! So here is something that might help you. [Click here (local help URL; if you don't have the iPhone SDK documentation installed on your system, don't click this URL)]. And don't forget to bookmark it! ;-p
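As an illustration, here is a call that might set the volume on input bus 0 of a multichannel mixer unit (mixerUnit is assumed to have been created in an AUGraph beforehand; the bus number and value are arbitrary):

```objc
#include <AudioToolbox/AudioToolbox.h>

// Set the volume of input bus 0 on a multichannel mixer unit to 50%.
OSStatus result = AudioUnitSetParameter(mixerUnit,                      // inUnit
                                        kMultiChannelMixerParam_Volume, // inID
                                        kAudioUnitScope_Input,          // inScope
                                        0,                              // inElement (bus 0)
                                        0.5f,                           // inValue
                                        0);                             // inBufferOffsetInFrames
```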

Friday, April 17, 2009

A “Retain” Trap

I just met a weird problem when using UIButton to create a customized button: the object gets released automatically even though I declared a property with retain.

Before I merged this code into my main development branch, I had tested and run it in a test project, where it worked fine.

UIButton *myButton;

@property(nonatomic, retain) UIButton *myButton;

then I created this button:

myButton = [UIButton buttonWithType:UIButtonTypeCustom];

and used this button in another method (message):

myButton.frame = …;

When I set the myButton.frame property, it always returned “EXC_BAD_ACCESS”, or some weird “selector not found on UILayer…”, blah blah…

While this snippet worked fine in my test program. After some quick investigation, the problem turned out to be how we maintain the object after we create it. buttonWithType: is a class factory method of UIButton, and any object created by a factory method is placed in the autorelease pool, so when it will actually be released is unpredictable. And even though we declared the UIButton property with the retain attribute, assigning to myButton directly sets the instance variable and bypasses the retaining setter, so we still need to do the retain by hand. Like this:

myButton = [[UIButton buttonWithType:UIButtonTypeCustom] retain];
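Alternatively, assigning through the property setter (note the `self.`) lets the declared retain attribute do the retaining for you:

```objc
// Equivalent fix: the synthesized setter retains the button.
self.myButton = [UIButton buttonWithType:UIButtonTypeCustom];
```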

Wow, it works fine now.