Thursday, April 20, 2006
ISACT
One of the spoils of GDC this year was a chance to see 4 sets of tools used in the implementation of interactive audio.
For the record they were:
- FMOD
- XACT
- wWise
- ISACT
All of them were up to the task of bringing game audio accessibility to the next level. While they all shared a base level of parameter adjustments and interactive tweaks, each also specialized in one or more areas: FMOD had a beautiful visual front end, XACT had the power of being tied directly to known hardware along with integrated visual in-game testing modes, wWise (new this year) brought powerful reporting stats and a clean, minimalist design aesthetic, and ISACT looks to do well overall.
With a basis in OpenAL and Ogg Vorbis, ISACT looks to be the perfect fit for what we intend to do on the audio side of 0 A.D. Theories for tools and schemes for implementation have already been discussed, and the integration of ISACT seems to coincide with previous plans without the laborious need for custom tool building.
There is still work to do: ISACT needs to be integrated with what is currently implemented, programmers will need to "look under the hood" a bit, systems must be set up to make implementation smooth, and everything should adhere to the definitions already set up on the wiki.
A quick overview of ISACT bullet points.

The ISACT suite:
- The ISACT Production studio
- The ISACT Build Utility
- The ISACT Run-time

Run-time features:
- Run-time Mixing
- Audio Samples
- 3D Buffer Control
- Global Effects
- Spatial Paths
- Sound Events
- Audio Sequences
- Timelines
- Sound Queues
- Real-time Parameter Controls
- Transitions
- Sound Randomisers
- Sound Entities
I made a short movie using the ambiences that Avenue Audio has provided.
In this example you'll see 2 things at work:
There are 2 "Sound_Events":
These are loaded with the same 4 agricultural ambiences (identical in either Sound_Event).
These Sound_Events play back the 4 files in various ways:
1. In a user-specified order.
2. By chance, with various ways to determine the likelihood of each one playing
(with provisions for preventing repeats)
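To give a feel for those two playback modes, here is a minimal sketch of a selector that plays files either in a fixed order or at random without immediate repeats. This is illustrative logic only, not ISACT's actual implementation; the class and file names are made up.

```python
import random

class SoundEventSelector:
    """Chooses the next ambience file either in a user-specified
    order or at random, never repeating the previous pick."""

    def __init__(self, files):
        self.files = list(files)
        self.index = 0    # cursor for ordered playback
        self.last = None  # last file chosen in random mode

    def next_ordered(self):
        """Return files in the given order, looping back to the start."""
        f = self.files[self.index]
        self.index = (self.index + 1) % len(self.files)
        return f

    def next_random(self):
        """Return a random file, excluding the one just played."""
        choices = [f for f in self.files if f != self.last]
        self.last = random.choice(choices)
        return self.last

# Hypothetical ambience files standing in for the 4 agricultural layers:
selector = SoundEventSelector(["farm1.ogg", "farm2.ogg", "farm3.ogg", "farm4.ogg"])
print(selector.next_ordered())  # farm1.ogg
```

In a tool like ISACT, the per-file play probabilities would be authored as parameters rather than a uniform `random.choice`, but the no-repeat provision works the same way: exclude the last pick from the candidate pool.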
There is then a "Sound_Entity":
This was created to enable multiple "players/channels/streams" to play simultaneously based on definable action variables such as state changes.
In this example, the Sound_Entity controls whether a single set of Sound_Events plays (based on the parameters for randomization) or both sets play simultaneously (based on the Sound_Entity action definitions).
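A Sound_Entity of this kind can be pictured as a small state machine that switches ambience layers on and off when the game state changes. This is a hypothetical sketch, not ISACT's API; the layer and state names are invented for illustration.

```python
class SoundEntity:
    """Toggles ambience layers in response to game-state changes.
    Layer and state names are illustrative, not ISACT's own."""

    def __init__(self):
        # Two layers, mirroring the two Sound_Events in the example.
        self.layers = {"ambience_a": False, "ambience_b": False}

    def on_state_change(self, state):
        # e.g. an idle field plays one layer; a busier state layers both.
        if state == "field_idle":
            self.layers = {"ambience_a": True, "ambience_b": False}
        elif state == "harvest":
            self.layers = {"ambience_a": True, "ambience_b": True}
        else:
            self.layers = {k: False for k in self.layers}

    def active_layers(self):
        return [name for name, on in self.layers.items() if on]

entity = SoundEntity()
entity.on_state_change("harvest")
print(entity.active_layers())  # ['ambience_a', 'ambience_b']
```

The appeal of authoring this in a tool rather than in code is exactly the point of the post: the Audio Team defines which states trigger which layers, and the engine only reports the state changes.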
Watch the example here.
ISACT TEST EXAMPLE MOVIE AGRICULTURAL AMBIENCE LAYERS
Did that make sense?
So, this was just one example of how interactive audio gets implemented with ISACT while keeping creative control in the hands of the Audio Team.
It then becomes a programming task of setting up the hooks into OpenAL calls, pointing to established 3D buffers, setting up Real-time Parameter Controls for world-based coefficient passing, and a ton of other things.
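As a rough idea of what "world-based coefficient passing" means, here is a minimal sketch of a parameter control that maps a world value (listener-to-source distance) to a gain coefficient each frame. The function name, curve, and default values are assumptions for illustration; the linear falloff is similar in spirit to OpenAL's linear distance model, and a real build would hand the result to the audio layer (e.g. via `alSourcef` with `AL_GAIN`).

```python
def distance_to_gain(distance, ref_dist=1.0, max_dist=50.0, rolloff=1.0):
    """Map a world distance to a 0..1 gain with a linear falloff:
    full volume at ref_dist, silent at max_dist (values are illustrative)."""
    d = max(ref_dist, min(distance, max_dist))  # clamp into the falloff range
    gain = 1.0 - rolloff * (d - ref_dist) / (max_dist - ref_dist)
    return max(0.0, min(1.0, gain))

# Each frame the game would compute the coefficient from world state and
# pass it down to the audio engine, e.g. alSourcef(source, AL_GAIN, gain).
print(distance_to_gain(1.0))   # 1.0
print(distance_to_gain(50.0))  # 0.0
```

The same pattern generalizes to any world variable the designers expose as a Real-time Parameter Control: time of day, unit count, battle intensity, and so on, each mapped through an authored curve to a mixing coefficient.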
Thanks for following along, y'all!
Until next time!