
iOS: Audio: Code needed for reading, processing, writing sound files + MIDI Processing [closed]

https://www.devze.com 2023-01-28 06:24 Source: web
Closed. This question needs details or clarity. It is not currently accepting answers. Closed 6 years ago.

What I think I need to do is create some code that lets me read in a bunch of raw sound files (i.e. a complete sound font for, say, a guitar), processes these files (to construct chords), and outputs the result as another set of files.

My question: can anyone point me to some code that does something close to this task, that would save me from having to do everything from scratch?

EDIT: An answer below suggested I use GarageBand, which I have had a look at. It looks like a great tool. I can construct my 24 chords in GarageBand, but then I need to save the result as MIDI, write my own code to process this MIDI file (adjusting the volumes of individual notes), save it, and then feed it back through GarageBand, recording the sound. Can anyone point me to some code that would get me started processing MIDI like this?
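As a starting point for the "adjust the volumes of individual notes" step, here is a minimal Python sketch. It assumes you have already parsed the MIDI file into a list of `(tick, status, data1, data2)` event tuples (any MIDI parsing library can give you this); the function name and the tuple layout are my own illustration, not part of any library.

```python
def scale_note_velocities(events, factor):
    """Return a copy of `events` with Note On velocities scaled by `factor`.

    `events` is a list of (tick, status, data1, data2) tuples.
    For a Note On message (status 0x9n), data1 is the note number and
    data2 is the velocity. A Note On with velocity 0 actually means
    Note Off, so it is left untouched.
    """
    out = []
    for tick, status, data1, data2 in events:
        if status & 0xF0 == 0x90 and data2 > 0:  # audible Note On
            # Clamp to the valid MIDI velocity range 1..127.
            data2 = max(1, min(127, round(data2 * factor)))
        out.append((tick, status, data1, data2))
    return out
```

The clamp keeps scaled notes audible and within the 7-bit MIDI range; Note Off events (status 0x8n) and velocity-0 Note Ons pass through unchanged.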

Sam

PS if it is of any interest, this is what I am working on:

http://imagebin.org/125562

The difficulty I face is how to voice the chords... if I just do {C4 E4 G4} for the C major chord, and {G4 B4 D5} for G, etc., it is going to sound horrible.

A pianist simply doesn't move from C to G like that. There is an art to voicing, so that each note attempts to move a minimal distance to its new resolution.

And I can't see any formula for depicting this in a way that is key agnostic.

So I am attempting instead to play all Cs, Es, and Gs, to create a sound texture for 'C major'.

If I put all of the respective amplitudes under a bell curve, each major or minor chord should have its energy centred around the same point, so the effect would be that the texture changes without giving any overt / crude impression of moving up / down.

Does this make some sense now? The task becomes: how to construct 24 textures?
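The bell-curve idea above can be sketched directly: weight every chord tone across the keyboard by a Gaussian centred on a fixed MIDI note, so each of the 24 textures has its energy in the same place. The function name, the default centre (middle C = MIDI 60), and the sigma value are illustrative assumptions, not values from the question.

```python
import math

def chord_texture(pitch_classes, center=60.0, sigma=12.0, lo=24, hi=96):
    """Map each chord tone in [lo, hi] to an amplitude under a bell curve.

    pitch_classes: set of pitch classes, e.g. {0, 4, 7} for C major (C = 0).
    center/sigma:  mean and spread of the Gaussian, in MIDI note numbers.
    Because the curve is fixed, every chord's energy is centred on the
    same point regardless of which pitch classes it contains.
    """
    texture = {}
    for note in range(lo, hi + 1):
        if note % 12 in pitch_classes:
            texture[note] = math.exp(-((note - center) ** 2) / (2 * sigma ** 2))
    return texture
```

Calling `chord_texture({0, 4, 7})` for C major and `chord_texture({7, 11, 2})` for G major produces two textures whose loudest notes sit around the same centre, which is exactly the "no overt impression of moving up / down" effect described above.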


If you are still looking for a solution, iOS 5 (finally!) includes the Core Audio MusicPlayer. It can read/write MIDI files, load SoundFonts, and play back MIDI with a variety of options.

Mac MusicPlayer Developer Reference

iOS MusicPlayer Developer Reference


You could probably peruse Stephan Bernsee's example code that comes with his free Dirac LE time stretching library. It has two classes, EAFRead and EAFWrite that read/convert/write audio files. You can get it from http://dirac.dspdimension.com. Check out the mobile/iOS folder that comes with the library. You can simply replace the call to his library with anything you want (like sample rate conversion to transpose sound, check out http://www.musicdsp.com for hints/code that does this).
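To illustrate the "sample rate conversion to transpose sound" idea mentioned above, here is a crude Python sketch using linear interpolation. This is my own illustration, not code from the Dirac library; note that plain resampling changes duration along with pitch (Dirac's time-stretching exists precisely to decouple the two).

```python
def transpose_by_resampling(samples, semitones):
    """Pitch-shift a mono sample list by resampling.

    Reading the input faster by a factor of 2**(semitones/12) raises the
    pitch by that many semitones, but also shortens the result by the
    same factor. Linear interpolation fills in between input samples.
    """
    ratio = 2.0 ** (semitones / 12.0)
    n_out = int(len(samples) / ratio)
    out = []
    for i in range(n_out):
        pos = i * ratio          # fractional read position in the input
        j = int(pos)
        frac = pos - j
        a = samples[j]
        b = samples[min(j + 1, len(samples) - 1)]
        out.append(a + (b - a) * frac)
    return out
```

For production quality you would use a windowed-sinc resampler (see musicdsp.com for examples); linear interpolation is only good enough to hear the effect.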


You could create all the chords in GarageBand and then export the sounds from there to be used in your app. (Although your app will be larger with all the digitized audio.)

