Extending the Basic Interactive 3D Environment 

Chapter 2 Creating an Interactive 3D Environment

Chapter 2 starts by outlining the general design of a generic game engine (with the aim of providing background information on 3D engine design). Focus then shifts to the implementation of a basic DirectX 10 3D interactive environment featuring mesh-loading, texture mapping, movable light sources, a GUI and stencil shadow volumes (as this study is conducted through the implementation of such a system).
Outline:

  • Game engine architecture
  • Game initialisation and shutdown
  • The game loop
  • Creating a basic interactive DirectX 10 3D environment

Game Engine Architecture

A game engine is the central unit of any computer game and it can be described as a collection of technologies such as a sound engine, AI subsystem, physics engine, networking subsystem, 3D renderer, input control system, etc. The number of subsystems provided is highly dependent on the developer’s requirements and the implementation platform of choice.
Game engines, built upon various APIs such as DirectX and OpenGL, are normally designed with software componentry in mind. This allows for decomposition of the engine, resulting in numerous functional units. By designing component-based engines, we are able to replace provided technologies with other third-party or in-house developed units as needed. For example, a game engine’s renderer, physics engine or sound system can easily be replaced by an improved or alternate version in a plug-and-play fashion.
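As a minimal sketch of this plug-and-play idea (the interface, class and factory names below are invented for illustration and are not taken from the engine described in this chapter), a renderer can be hidden behind an abstract interface so that replacing it touches only a single factory function:

// Minimal sketch of a swappable renderer component.
// IRenderer, D3D10Renderer and CreateRenderer are hypothetical names.
#include <memory>

class IRenderer
{
public:
    virtual ~IRenderer() = default;
    virtual bool Initialise(int width, int height) = 0;
    virtual void RenderFrame() = 0;
    virtual void Shutdown() = 0;
};

class D3D10Renderer : public IRenderer
{
public:
    bool Initialise(int width, int height) override { /* create device and swap chain */ return true; }
    void RenderFrame() override { /* clear, draw, present */ }
    void Shutdown() override { /* release COM interfaces */ }
};

// The rest of the engine depends only on IRenderer; swapping the
// renderer means changing this one factory function.
std::unique_ptr<IRenderer> CreateRenderer()
{
    return std::make_unique<D3D10Renderer>();
}

The rest of the engine then holds only an IRenderer pointer, so an OpenGL implementation for other platforms could be returned from the same factory without any further changes.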
The term “game engine” has existed for some time now, but only became truly common in the mid-1990s when developers started licensing the core code of other games for their own titles. This reuse led to the development of high-end commercial game engines and middleware that provide game developers with a range of game-creation tools and technical components, thus accelerating the game development process. The following list gives some idea of what might be supported by a commercially targeted game engine:

3D Engine

  • Direct3D 10 renderer for Microsoft Windows-based systems
  • OpenGL renderer for Mac OS X, Linux, Unix, etc.
  • High Level Shading Language (HLSL) and C for Graphics (Cg) shader support
  • Normal mapping
  • Environmental mapping
  • Displacement mapping
  • High Dynamic Range lighting
  • Depth-of-field
  • Motion blur
  • Bloom and Sobel effects (for older hardware support)
  • ROAM algorithm based automatic Level of Detail (LOD) adaptation system
  • Dynamic lighting and shadowing
  • Soft shadows
  • Specular reflections with specular bump maps
  • Reflective water (with refraction)
  • Highly efficient occlusion culling
  • Dynamically deformable and destroyable geometry
  • Cg rendered moving grass, trees, fur, hair, etc
  • Advanced Particle System: model and sprite based (snow, smoke, sparks, rain, ice storms, fire storms, volumetric clouds, weather system, etc)
  • Non-Player Character (NPC) Material Interaction System (vehicle sliding on ice, etc)

Artificial Intelligence (AI) Subsystem

  • Cognitive model based NPC AI (no way-point system)
  • Intelligent non-combat and combat NPC interaction
  • Conversation system
  • NPCs decide whether to fight, dodge, flee, hide, burrow, etc. based on player resistance
  • NPCs fall back to regroup if resistance is overwhelming

Sound Engine

  • Stereo, 5.1 surround sound, quadraphonic sound, 3D spatialisation
  • Ogg (the open audio container format) and adaptive differential pulse-code modulation (ADPCM) decompression
  • Real-time audio file stitching (Ogg and Wave)
  • Distance-variant distortion
  • Material based distortion (e.g. under water distortion of helicopter hovering overhead)
  • Environmental DSP (Digital Signal Processing)

Physics Engine

  • Realistic object interaction based on Newton’s Laws
  • Particle system inherits from Physics Engine
  • NPCs interact with objects realistically
  • All objects react based on force exerted and environmental resistance

Networking System

  • Up to 64-player LAN and 32-player internet support
  • High-latency, high-packet loss optimisations
  • Predictive collision detection performance enhancement

Development

  • In-game level and terrain editor
  • Exporters (meshes, brushes, etc)
  • C++ code compiled to a modular design
  • Event debugger and monitoring tools built into engine
  • Shader editor

Creating a game engine supporting all the elements listed above takes a great deal of time, money, skilled developers and support infrastructure. However, most of the listed features can be added to an engine in a pluggable fashion. Hence, a basic first-person shooter game engine can be designed and implemented by a single programmer, with time being the only limit on the number of supported features. A well-defined architecture is thus of critical importance; without one, the source code of an engine would not be extensible, maintainable or easily understandable.
The source code of a game can be divided into two units, namely the game-engine code and the game-specific code. The game-specific code deals exclusively with in-game play elements, for instance the behaviour of non-player characters, mission-based events and logic, the main menu, etc. Game-specific code is not intended for future re-use and is thus excluded from the game-engine code. Game-engine code forms the core of the entire game implementation, with the game-specific code being executed on top of it. The game engine is separate from the game being developed in the sense that it provides all the technological components without any hard-coded information about the actual gameplay. Game-specific and engine-specific code are commonly compiled to dynamic-link libraries for easy distribution, modification and updating.
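On Windows, one common way to realise this split is to let the engine load the game-specific DLL at run time via the standard Win32 LoadLibrary and GetProcAddress calls. The sketch below assumes a module name (game.dll) and an exported factory function (CreateGameInstance) invented for illustration; they are not the dissertation’s actual entry points.

// Engine-side sketch: loading game-specific code from a DLL.
// "game.dll" and CreateGameInstance are hypothetical names.
#include <windows.h>

struct IGame; // interface implemented by the game-specific code

typedef IGame* (*CreateGameFn)();

IGame* LoadGameModule()
{
    HMODULE module = LoadLibraryA("game.dll");
    if (!module)
        return nullptr;

    // Resolve the exported factory function by name.
    CreateGameFn createGame =
        reinterpret_cast<CreateGameFn>(GetProcAddress(module, "CreateGameInstance"));

    return createGame ? createGame() : nullptr;
}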
Game-engine code and game-specific code can be designed and integrated using one of the following architectures: ad-hoc, modular or the directed acyclic graph (DAG) architecture.
Ad-hoc architecture describes a code base developed without any specific direction or logical organisation (Eberly, 2001). For example, a developer simply adds features to a game engine on an “as-needed” basis. This form of code organisation leads to very tight coupling (a high level of dependency) between the game-specific and game-engine code – something that is acceptable in small game projects such as mobile and casual games.
Modular architecture organises the code base into modules or libraries, with a module consisting of numerous functions available for use by other modules or libraries (Flynt and Salem, 2004). Using this design, we are able to add and change modules as needed. Middleware such as a third-party physics engine can also easily be integrated into a modular code base. Modular organisation results in moderate coupling between the various code components. However, one must take care to limit inter-module communication to avoid a situation where every module communicates with every other module, which leads to tighter coupling. Figure 2.1 illustrates the modular organisation of a code base.
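A hedged sketch of such middleware integration follows; IPhysics and ThirdPartyPhysics are invented names, and the middleware calls are placeholders for whatever SDK is actually linked:

// Sketch: confining physics middleware to a single module.
class IPhysics
{
public:
    virtual ~IPhysics() = default;
    virtual void Step(float deltaSeconds) = 0;
};

class ThirdPartyPhysics : public IPhysics
{
public:
    void Step(float deltaSeconds) override
    {
        // Calls into the middleware SDK would go here; no other
        // module needs to include the middleware's headers, so the
        // library can be replaced without touching the rest of the
        // engine.
        (void)deltaSeconds;
    }
};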
A directed acyclic graph architecture is a modular architecture in which the inter-module dependencies are strictly regulated. A directed acyclic graph is a directed graph without any directed cycles, meaning that no module may depend, directly or indirectly, on a module that already depends on it. For example, if the input module depicted in Figure 2.1 depends on the game state module, then the game state module cannot depend on any of the other modules that depend on the input module. The directed acyclic graph architecture is thus used to create a hierarchical design where some modules are classified on a higher level than others. This hierarchical structure, shown in Figure 2.2, ensures relatively loose coupling.
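One lightweight way to express this hierarchy in C++ is to let dependencies point strictly downwards between modules. The sketch below borrows the input and game state modules of Figure 2.1, with all names invented for illustration:

// Lower-level module: knows nothing about the modules above it.
namespace game_state
{
    struct State
    {
        float playerX;
        float playerY;
    };
}

// Higher-level module: the dependency edge points strictly downwards
// (input -> game_state). An edge in the opposite direction, e.g. the
// game state module calling into the input module, would introduce a
// cycle and the graph would no longer be acyclic.
namespace input
{
    void Process(game_state::State& state)
    {
        state.playerX += 1.0f;  // illustrative response to input
    }
}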
Other architectures also exist, each providing a different level of coupling and inter-module communication, with the choice of architecture varying from application to application.
Once we have chosen the preferred overall architecture, we have to summarise all the possible states our game will go through from initialisation to shutdown. The possible states (with associated events) are listed here, and a minimal loop sketch follows the list:

  1. Enter the main game loop:
     • Additional initialisation and memory allocation.
     • Load introductory video.
  2. Initialise and display the in-game menu:
     • Event monitoring.
     • Process user input.
     • Start game.
  3. In-game loop:
     • Input monitoring.
     • Execution of AI.
     • Execution of physics routines.
     • Sound and music output.
     • Execution of game logic.
     • Rendering of the scene based on the input from the user and other subsystems.
     • Display synchronisation.
     • Update game state.
  4. Exit the game and return to the in-game menu.
  5. Shut down the game if the user wishes to terminate the program.
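A minimal C++/Win32 sketch of the in-game loop portion of these states is given below. The Update and Render functions are hypothetical stand-ins for the engine’s actual subsystems, and the message pump uses the standard PeekMessage idiom:

#include <windows.h>

// Hypothetical subsystem entry points standing in for the engine's
// AI, physics, audio, game-logic and rendering components.
void UpdateAI()        {}  // AI decision-making would run here
void UpdatePhysics()   {}  // physics integration step
void UpdateAudio()     {}  // sound and music output
void UpdateGameLogic() {}  // mission and game-play logic
void RenderScene()     {}  // draw calls based on the current state
void PresentFrame()    {}  // display synchronisation (e.g. Present)
void UpdateGameState() {}  // prepare state for the next iteration

void RunGameLoop()
{
    bool running = true;
    MSG msg = {};

    while (running)
    {
        // Input monitoring: drain all pending window messages first.
        while (PeekMessage(&msg, nullptr, 0, 0, PM_REMOVE))
        {
            if (msg.message == WM_QUIT)
                running = false;
            TranslateMessage(&msg);
            DispatchMessage(&msg);
        }

        UpdateAI();
        UpdatePhysics();
        UpdateAudio();
        UpdateGameLogic();
        RenderScene();
        PresentFrame();
        UpdateGameState();
    }
}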

These states will now be investigated in more detail. As mentioned, this section deals with the general design and implementation of a generic game engine which serves as the core of the proposed dynamically scalable interactive rendering engine. The next section will show how the engine allows for basic input control in the form of user-movable light sources, a first-person camera and a controllable mesh. Its rendering capabilities come from the algorithms presented in Section 2.3. This extended rendering engine features dynamic swapping of shadow rendering algorithms, shaders, local illumination configurations, a number of reflection and refraction implementations and approaches, physics algorithms, a particle effect system and numerous post-processing effects. The CPU-GPU process allocation sub-system, as previously mentioned, is used to control performance and quality and serves chiefly as a proof of concept. It is only used for CPU-based cube mapping (the real-time allocation of the presented cube mapping approach), PhysX-based physics calculations and the execution of the presented particle system (illustrating that the CPU can, in practice and under significant load, be used to free up valuable GPU resources).
All the implemented algorithms are presented and discussed at source-code level, a form of presentation that starts below.

Abstract 
Acknowledgements 
Preface 
Part I
Chapter 1: Introduction 
1.1 Research Domain
1.2 Problem Statement
1.3 Dissertation Structure
Chapter 2: Creating an Interactive 3D Environment 
2.1 Game Engine Architecture
2.2 Initialisation and Shutdown
2.3 The Game Loop
2.4 Creating a Basic Interactive DirectX 10 3D Environment
2.5 Summary
Chapter 3: Extending the Basic Interactive 3D Environment 
3.1 Extending the Basic Interactive DirectX 10 3D Environment
3.2 Shaders
3.3 Local Illumination
3.4 Reflection and Refraction
3.4.1 Implementing Cube Mapping
3.4.2 Implementing Basic Refraction
3.4.3 Reflection and Refraction Extended
3.5 Adding High Dynamic Range (HDR) Lighting
3.6 Shadows
3.6.1 Stencil Shadow Volumes
3.6.2 Implementing Shadow Mapping
3.6.3 Hybrid and Derived Approaches
3.7 Physics
3.7.1 The Role of Newton’s Laws
3.7.2 Particle Effects
3.7.3 Particle System Implementation
3.8 Post-Processing
3.9 Summary
Part II 
Chapter 4: Benchmarking the Rendering Algorithms and Techniques
4.1 Benchmarking Mechanism
4.2 Rendering Subsystem Evaluation Criteria
4.3 Algorithm Comparison
4.3.1 Shadows
4.3.2 Shaders
4.3.3 Local Illumination
4.3.4 Reflection and Refraction
4.3.5 Physics
4.3.6 Particle Effects
4.3.7 Post-Processing
4.4 Summary
Chapter 5: An Empirically Derived System for Distributed Rendering 
5.1 Introduction
5.2 The Selection Engine and the Dynamic Selection and Allocation of Algorithms
5.2.1 Shadows
5.2.2 Shaders
5.2.3 Local Illumination
5.2.4 Reflection and Refraction
5.2.5 Physics
5.2.6 Particle Effects
5.2.7 Post-Processing
5.3 Construction of the Algorithm Selection Mechanism
5.4 Results
5.5 Summary
Chapter 6: Summary and Conclusion 
6.1 Summary
6.2 Concluding Remarks and Future Work
References 
Appendix